Posts Tagged ‘business history’
“You are where you are today because you stand on somebody’s shoulders”*…
Leon Prieto and Simone Phipps, two management professors who are husband and wife and the co-authors of African American Management History, have been working to fill in the gaps in business history left by the omission of Black business stories. The pair argue that the ideas supported by African American managers during the first few decades of the 20th century, a relative golden age for Black business, hold lessons that are relevant in this century– perhaps especially the example of Charles Clinton Spaulding, who led North Carolina Mutual Life Insurance Company, the largest African American life insurance company of its time, for 50 years until his death in 1952…
Several years ago, reading a book about Black business history, and then checking the bibliography for original sources, Prieto discovered a kind of manifesto Spaulding had written in 1927 for the Pittsburgh Courier, the largest Black newspaper of the era, reaching hundreds of thousands of readers. Under the headline “The Administration of Big Business,” Spaulding shared his views on running a major firm. To his mind, the eight fundamentals of operations that demanded a leader’s attention were: cooperation and teamwork; authority and responsibility; division of labor; adequate manpower; adequate capital; feasibility analysis; advertising budget; and conflict resolution.
His article, the scholars note, was published 20 years before similar theories about the functions of management by Henri Fayol, a French theorist and textbook mainstay, were translated for American readers. Despite the overlap in the two men’s thinking, only Fayol has been awarded institutional recognition. (The podcast Talking About Organizations, which invited Prieto and Phipps to be guests on the show last year, has transcribed Spaulding’s article in full, here.)
…
In the writings and speeches in Spaulding’s archives, housed at Duke University, Phipps and Prieto discovered an unrelenting call for cooperation and consensus-building within organizations, and an emphasis on the symbiotic relationship between a company and the world outside its doors.
Spaulding’s devotion to a collective style of working and to corporate social responsibility was not an isolated case of the era. Nor did it materialize strictly as a response to the times, the pair assert. Rather, they hypothesize that the cooperative model that was popular among Black businesses then—and which infused the way free-market enterprises operated in the Black Wall Streets of Durham and other American cities like Tulsa, Oklahoma—grew out of a much older African philosophy called Ubuntu, a Nguni Bantu word meaning humanity, derived from an idiom that’s sometimes translated as “I am because we are” or “a person is a person through other persons.” Ubuntu as a world view that stresses our interconnectedness was popularized globally in the 1960s, primarily by Desmond Tutu, the South African archbishop emeritus and Nobel Peace Prize-winning human rights activist.
The sense that ubuntu defines our human experience is common in several African cultures, Prieto says, and manifests in a range of cooperative financial models that flourish across the African diaspora. (For example, he had grown up contributing to sou sou, or a savings club, he tells his students in lectures, and it was a sou sou that allowed him to purchase the plane ticket that brought him to the US.) It may not have been called ubuntu, but that moral code survived as a shared value among Africans enslaved in the US, Prieto and Phipps say…
Stories from which we can learn: “The history of Black management reveals an overlooked form of capitalism,” from @qz.
* “You are where you are today because you stand on somebody’s shoulders. And wherever you are heading, you cannot get there by yourself. If you stand on the shoulders of others, you have a reciprocal responsibility to live your life so that others may stand on your shoulders. It’s the quid pro quo of life. We exist temporarily through what we take, but we live forever through what we give.” – Vernon Jordan
###
As we rethink the rules, we might recall that it was on this date in 1926 that Dr. Carter G. Woodson, a noted historian, journalist, and author, and the founder of the Association for the Study of African American Life and History, began “Negro History Week”– the forerunner to Black History Month.
“They said I was a valued customer, now they send me hate mail”*…
Is shopping therapy… or an occasion for therapy?…
… Throughout the coronavirus pandemic, videos of irate anti-maskers screaming, throwing things, and assaulting employees at big-box and grocery stores have become a social-media mainstay. As Americans return en masse to more types of in-person commerce, the situation only seems to be deteriorating. At its most violent extreme, workers have been hospitalized or killed. Eight Trader Joe’s employees were injured in one such attack in New York, and in Georgia, a grocery-store cashier was shot over a mask dispute. Far more frequent are the accounts of short-fused shoppers becoming verbally abusive or otherwise degrading toward workers over slow service or sold-out goods. Earlier this month, a restaurant on Cape Cod reportedly was so overwhelmed with rude customers that it shut down for a “day of kindness.”
America’s ultra-tense political climate, together with the accumulated personal and economic traumas of the pandemic, has helped spur this animosity, which was already intense and common in the United States. But it’s hardly the only reason that much of the country has decided to take out its pandemic frustrations on the customer-service desk. For generations, American shoppers have been trained to be nightmares. The pandemic has shown just how desperately the consumer class clings to the feeling of being served.
The experience of buying a new television or a double cheeseburger in a store has gotten worse in your lifetime. It’s gotten worse for the people selling TVs and burgers too. The most immediate culprit is decades of cost-cutting; by increasing surveillance and pressure on workers during shifts, reducing their hours and benefits, and not replacing those who quit, executives can shine up a business’s balance sheet in a hurry. Sometimes, you can see these shifts happening in real time, as with pandemic-era QR-code-ordering in restaurants, which allows them to reduce staff—and which is likely to stick around. Wages and resources dwindle, and more expensive and experienced workers get replaced with fewer and more poorly trained new hires. When customers can’t find anyone to help them or have to wait too long in line, they take it out on whichever overburdened employee they eventually hunt down.
This dynamic is exacerbated by the fact that the United States has more service workers than ever before, doing more types of labor, spread thin across the economy—Uber drivers; day-care workers; hair stylists; call-center operators; DoorDash “dashers”; Instacart shoppers; home health aides; Amazon’s fleet of delivery people, with your cases of toilet paper and new pajamas in the trunk of their own car. In 2019, one in five American workers was employed in retail, food service, or hospitality; even more are now engaged in service work of some kind.
For people currently alive and shopping in America, this economic arrangement is so all-encompassing that it can feel like the natural order of things. But customer service as a concept is an invention of the past 150 years. At the dawn of the second Industrial Revolution, most people grew or made much of what they used themselves; the rest came from general stores or peddlers. But as the production of food and material goods centralized and rapidly expanded, commerce reached a scale that the country’s existing stores were ill-equipped to handle, according to the historian Susan Strasser, the author of Satisfaction Guaranteed: The Making of the American Mass Market. Manufacturers needed ways to distribute their newly enormous outputs and educate the public on the wonder of all their novel options. Americans, in short, had to be taught how to shop.
In this void grew department stores, the very first of which appeared in the United States in the 1820s. The model proliferated in cities as the 20th century neared and industrial manufacturing expanded. By consolidating sales under corporate auspices in much the same way that factories consolidated production, businesses such as Wanamaker’s, Macy’s, and Marshall Field’s hinted at the astonishing ways American life would change over the next century. But consolidation also created a public-image issue, argues the historian William Leach in Land of Desire: Merchants, Power, and the Rise of a New American Culture. Corporate power wasn’t especially popular in fin de siècle America, where strike-breaking industrial barons taught those without wealth to mistrust the ownership class. People were suspicious of new types of big business and protective of the small dry-goods stores run by members of their communities.
Department-store magnates alleviated these concerns by linking department stores to the public good. Retailers started inserting themselves into these communities as much as possible, Leach writes, turning their enormous stores into domains of urban civic life. They hosted free concerts and theatrical performances, offered free child care, displayed fine art, and housed restaurants, tearooms, Turkish baths, medical and dental services, banks, and post offices. They made splashy contributions to local charities and put on holiday parades and fireworks shows. This created the impression that patronizing their stores wouldn’t just be a practical transaction or an individual pleasure, but an act of benevolence toward the orderly society those stores supported.
With these goals in mind, Leach writes, customer service was born. For retailers’ tactics to be successful, consumers—or guests, as department stores of the era took to calling them—needed to feel appreciated and rewarded for their community-minded shopping sprees. So stores marshaled an army of workers: From 1870 to 1910, the number of service workers in the United States quintupled. It’s from this morass that “The customer is always right” emerged as the essential precept of American consumerism—service workers weren’t there just to ring up orders, as store clerks had done in the past. Instead, they were there to fuss and fawn, to bolster egos, to reassure wavering buyers, to make dreams come true. If a complaint arose, it was to be resolved quickly and with sincere apologies.
…
The efforts that Leach identified among turn-of-the-century department-store owners to paint their businesses as the true sites of popular democracy have been successful beyond what they probably could have imagined at the time. Most Americans now expect corporations to take a stand on contentious social and political issues; in return, corporations have even co-opted some of the language of actual politics, encouraging consumers to “vote with their dollars” for the companies that market themselves on the values closest to their own.
For Americans in a socially isolating culture, living under an all but broken political system, the consumer realm is the place where many people can most consistently feel as though they are asserting their agency. Most people in the United States don’t exactly have a plethora of opportunities to develop meaningful identities outside their economic station: Creative or athletic pursuits are generally cut off when people enter the workforce, fewer people attend religious services than in generations past, and loneliness and alienation are widespread. Americans work long hours, and many of those with disposable income earn it through what the anthropologist David Graeber calls “bullshit jobs”—the kind of empty spreadsheet-and-conference-call labor whose lack of real purpose and meaning, Graeber theorizes, is an ambient psychological stressor on the people performing it. What these jobs do provide, though, is income, the use of which can feel sort of like an identity.
This is not a feature of a healthy society. Even before the pandemic pushed things to further extremes, the primacy of consumer identity made customer-service interactions particularly conflagratory…
“American Shoppers Are a Nightmare”– and as Amanda Mull (@amandamull) explains, customers were nearly this awful long before the pandemic.
* Sophie Kinsella, Confessions of a Shopaholic
###
As we reconsider commerce, we might recall that it was on this date in 1939 that The Wizard of Oz premiered at the Strand Theater in Oconomowoc, Wisconsin– one of four Midwestern test screenings in advance of the Hollywood premiere at Grauman’s Chinese Theatre (on August 15).
Considered one of the greats in the American film canon, it was of course based on the work of L. Frank Baum… who, before he created Dorothy and her adventures, was a retail pioneer. An accomplished window dresser (the turn-of-the-20th-century equivalent of a television commercial director), he founded and edited a magazine called The Show Window, later known as the Merchants Record and Show Window, which focused on store window displays, retail strategies, and visual merchandising; it’s still being published, now as VMSD.
“The heart and soul of the company is creativity”*…
Creativity doesn’t have a deep history. The Oxford English Dictionary records just a single usage of the word in the 17th century, and it’s religious: ‘In Creation, we have God and his Creativity.’ Then, scarcely anything until the 1920s – quasi-religious invocations by the philosopher A N Whitehead. So creativity, considered as a power belonging to an individual – divine or mortal – doesn’t go back forever. Neither does the adjective ‘creative’ – being inventive, imaginative, having original ideas – though this word appears much more frequently than the noun in the early modern period. God is the Creator and, in the 17th and 18th centuries, the creative power, like the rarely used ‘creativity’, was understood as divine. The notion of a secular creative ability in the imaginative arts scarcely appears until the Romantic Era, as when the poet William Wordsworth addressed the painter and critic Benjamin Haydon: ‘Creative Art … Demands the service of a mind and heart.’
This all changes in the mid-20th century, and especially after the end of the Second World War, when a secularised notion of creativity explodes into prominence. The Google Ngram chart bends sharply upwards from the 1950s and continues its ascent to the present day. But as late as 1970, practically oriented writers, accepting that creativity was valuable and in need of encouragement, nevertheless reflected on the newness of the concept, noting its absence from some standard dictionaries even a few decades before.
Before the Second World War and its immediate aftermath, the history of creativity might seem to lack its object – the word was not much in circulation. The point needn’t be pedantic. You might say that what we came to mean by the capacity of creativity was then robustly picked out by other notions, say genius, or originality, or productivity, or even intelligence – or whatever capacity it was believed enabled people to think thoughts considered new and valuable. And in the postwar period, a number of commentators did wonder about the supposed difference between emergent creativity and such other long-recognised mental capacities. The creativity of the mid-20th century was entangled in these pre-existing notions, but the circumstances of its definition and application were new…
Once seen as the work of genius, how did creativity become an engine of economic growth and a corporate imperative? (Hint: the Manhattan Project and the Cold War played important roles.): “The rise and rise of creativity.”
(Image above: source)
* Bob Iger, CEO of The Walt Disney Company
###
As we lionize the latest, we might recall that it was on this date in 1726 that Jonathan Swift’s Travels into Several Remote Nations of the World. In Four Parts. By Lemuel Gulliver, First a Surgeon, and then a Captain of Several Ships— much better known as Gulliver’s Travels— was first published. A satire both of human nature and of the “travelers’ tales” literary subgenre popular at the time, it was an immediate hit (John Gay wrote in a 1726 letter to Swift that “It is universally read, from the cabinet council to the nursery”). It has, of course, become a classic.

“My favorite special skill on my resume is ‘excellent monkey noises'”*…

If you walk into any bookstore or library in the world, you’re going to see dozens, possibly even hundreds of books about how to write a good résumé, how to structure it in a way that maximizes what you do best. Many will tell you to keep things under a page if you’re not above a certain age range; others will tell you that there’s nothing worse for making a first impression than a misplaced comma or repeated word.
But one thing that you likely will not find is a book on résumé-writing that dates from before 1970 or so. (Probably the first book on the topic with any long-lasting authority is Richard Bolles’ long-running What Color Is Your Parachute? series, a self-help book that discourages the use of spray-and-pray résumé tactics.) Most of them will date to 1980 or beyond, in fact.
While both the résumé and the curriculum vitae existed before then and were frequently asked for in want ads as early as the late 1940s in some professional fields, something appears to have changed in their role starting in the late 1970s and early 1980s—around the time when many service-oriented fields first gained prominence—in which the résumé, particularly in North America, turned into a de facto requirement when applying for most new jobs.
Companies started treating humans as resources around this time, and many workers traded in their blue collars for white ones. It was a big shift, and the résumé was in the middle of it.
Why the name change, though? There are a lot of reasons why “résumé” won out over “application letter,” but I think one of the biggest might come from the education field of the era. The U.S. Department of Education’s Education Resources Information Center launched in 1965, and early in its life, relied on the terminology “document resume” to refer to its bibliographic entries, which are similar to résumés for people. This information reached schools through documents produced by the Education Department, and my theory is that the influence of this material on educators might just have touched the business world, too.
The shifting nature of work also heightened the need for more personalized applications. A 1962 book, Analyzing the Application for Employment, noted the overly complex nature of fill-in-the-blank application forms, and that they would often take hours for prospective employees to fill out. In the book, author Irwin Smalheiser of Personnel Associates highlights an example of one such person stuck dealing with complex application processes:
One man we know, who perpetually seems to be looking for work, has devised a neat system for coping with the application blanks he encounters. He has taken the time to complete a detailed summary of his work history which he carries in his wallet. When he is asked to fill out the company application form, he simply copies the pertinent dates and names of the companies for which he worked.
In many ways, a résumé solves this problem. While some level of modification comes with the work of sending out a résumé, you often can reuse it again and again without having to repeat your work. Sure, job applications stuck around for lower-end jobs, like fast food, but the résumé stuck around nearly everywhere else.
In a slower world, it was the best tool we had for applying for a new job. The problem is, the world got faster—and the model began to show its flaws…
The résumé, a document that largely gained prominence in the past half-century, was once a key part of getting a job. Soon, it might just disappear entirely. From the always-illuminating Ernie Smith, “Throw It In The Pile.”
###
As we boil it down (and spice it up), we might recall that it was on this date in 1834 that President Andrew Jackson sent federal troops to intervene in a labor dispute for the first time in U.S. history. Foreshadowing the notorious cases of federal military intervention in labor disputes during America’s Gilded Age, Jackson quashed labor unrest during the construction of the C&O Canal.
“History in its broadest aspect is a record of man’s migrations from one environment to another”*…
All roads lead from Rome, according to a visual history of human culture built entirely from the birth and death places of notable people. The 5-minute animation provides a fresh view of the movements of humanity over the last 2,600 years.
Maximilian Schich, an art historian at the University of Texas at Dallas, and his colleagues used the Google-owned knowledge base, Freebase, to find 120,000 individuals who were notable enough in their lifetimes that the dates and locations of their births and deaths were recorded.
The list includes people ranging from Solon, the Greek lawmaker and poet, who was born in 637 BC in Athens, and died in 557 BC in Cyprus, to Jett Travolta — son of the actor John Travolta — who was born in 1992 in Los Angeles, California, and died in 2009 in the Bahamas.
The team used those data to create a movie that starts in 600 BC and ends in 2012…
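The data model behind the animation is simple: each notable person contributes one directed (birthplace, deathplace) pair, and tallying those pairs yields the migration flows to draw. A minimal sketch of that aggregation step, using entirely hypothetical sample records (the places below are illustrative, not the study’s data):

```python
from collections import Counter

# Hypothetical records: each notable person contributes one
# (birthplace, deathplace) pair, as in the Schich team's dataset.
people = [
    ("Athens", "Cyprus"),       # e.g. Solon
    ("Rome", "Rome"),
    ("Rome", "Paris"),
    ("Paris", "Rome"),
    ("Los Angeles", "Bahamas"),
    ("Rome", "Paris"),
]

# Count directed migration flows between places, skipping people
# who died where they were born (no movement to draw).
flows = Counter((b, d) for b, d in people if b != d)

for (origin, destination), count in flows.most_common():
    print(f"{origin} -> {destination}: {count}")
```

Each flow would then be rendered as an animated arc, timed by the person’s death year; the full visualization adds that temporal and geographic layer on top of this simple tally.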
Learn more (e.g., that more architects than artists died in the French Revolution) at Nature.
* Ellsworth Huntington
###
As we take the long view, we might recall that on this date in 1597, the Hanseatic League (a commercial confederation of merchant guilds and market towns in northern Europe) expelled all English merchants. The expulsion was a product of ongoing tensions with English and Dutch trading interests, and a direct response to Elizabeth I’s closure of the Steelyard, the League’s trading post in London.

The Hanseatic League
