(Roughly) Daily


“They said I was a valued customer, now they send me hate mail”*…

Is shopping therapy… or an occasion for therapy?…

… Throughout the coronavirus pandemic, videos of irate anti-maskers screaming, throwing things, and assaulting employees at big-box and grocery stores have become a social-media mainstay. As Americans return en masse to more types of in-person commerce, the situation only seems to be deteriorating. At its most violent extreme, workers have been hospitalized or killed. Eight Trader Joe’s employees were injured in one such attack in New York, and in Georgia, a grocery-store cashier was shot over a mask dispute. Far more frequent are the accounts of short-fused shoppers becoming verbally abusive or otherwise degrading toward workers over slow service or sold-out goods. Earlier this month, a restaurant on Cape Cod reportedly was so overwhelmed with rude customers that it shut down for a “day of kindness.”

America’s ultra-tense political climate, together with the accumulated personal and economic traumas of the pandemic, has helped spur this animosity, which was already intense and common in the United States. But it’s hardly the only reason that much of the country has decided to take out its pandemic frustrations on the customer-service desk. For generations, American shoppers have been trained to be nightmares. The pandemic has shown just how desperately the consumer class clings to the feeling of being served.

The experience of buying a new television or a double cheeseburger in a store has gotten worse in your lifetime. It’s gotten worse for the people selling TVs and burgers too. The most immediate culprit is decades of cost-cutting; by increasing surveillance and pressure on workers during shifts, reducing their hours and benefits, and not replacing those who quit, executives can shine up a business’s balance sheet in a hurry. Sometimes, you can see these shifts happening in real time, as with pandemic-era QR-code ordering in restaurants, which allows them to reduce staff—and which is likely to stick around. Wages and resources dwindle, and more expensive and experienced workers get replaced with fewer, more poorly trained new hires. When customers can’t find anyone to help them or have to wait too long in line, they take it out on whichever overburdened employee they eventually hunt down.

This dynamic is exacerbated by the fact that the United States has more service workers than ever before, doing more types of labor, spread thin across the economy—Uber drivers; day-care workers; hair stylists; call-center operators; DoorDash “dashers”; Instacart shoppers; home health aides; Amazon’s fleet of delivery people, with your cases of toilet paper and new pajamas in the trunk of their own car. In 2019, one in five American workers was employed in retail, food service, or hospitality; even more are now engaged in service work of some kind.

For people currently alive and shopping in America, this economic arrangement is so all-encompassing that it can feel like the natural order of things. But customer service as a concept is an invention of the past 150 years. At the dawn of the second Industrial Revolution, most people grew or made much of what they used themselves; the rest came from general stores or peddlers. But as the production of food and material goods centralized and rapidly expanded, commerce reached a scale that the country’s existing stores were ill-equipped to handle, according to the historian Susan Strasser, the author of Satisfaction Guaranteed: The Making of the American Mass Market. Manufacturers needed ways to distribute their newly enormous outputs and educate the public on the wonder of all their novel options. Americans, in short, had to be taught how to shop.

In this void grew department stores, the very first of which appeared in the United States in the 1820s. The model proliferated in cities as the 20th century neared and industrial manufacturing expanded. By consolidating sales under corporate auspices in much the same way that factories consolidated production, businesses such as Wanamaker’s, Macy’s, and Marshall Field’s hinted at the astonishing ways American life would change over the next century. But consolidation also created a public-image issue, argues the historian William Leach in Land of Desire: Merchants, Power, and the Rise of a New American Culture. Corporate power wasn’t especially popular in fin de siècle America, where strike-breaking industrial barons taught those without wealth to mistrust the ownership class. People were suspicious of new types of big business and protective of the small dry-goods stores run by members of their communities.

Department-store magnates alleviated these concerns by linking department stores to the public good. Retailers started inserting themselves into these communities as much as possible, Leach writes, turning their enormous stores into domains of urban civic life. They hosted free concerts and theatrical performances, offered free child care, displayed fine art, and housed restaurants, tearooms, Turkish baths, medical and dental services, banks, and post offices. They made splashy contributions to local charities and put on holiday parades and fireworks shows. This created the impression that patronizing their stores wouldn’t just be a practical transaction or an individual pleasure, but an act of benevolence toward the orderly society those stores supported.

With these goals in mind, Leach writes, customer service was born. For retailers’ tactics to be successful, consumers—or guests, as department stores of the era took to calling them—needed to feel appreciated and rewarded for their community-minded shopping sprees. So stores marshaled an army of workers: From 1870 to 1910, the number of service workers in the United States quintupled. It’s from this morass that “The customer is always right” emerged as the essential precept of American consumerism—service workers weren’t there just to ring up orders, as store clerks had done in the past. Instead, they were there to fuss and fawn, to bolster egos, to reassure wavering buyers, to make dreams come true. If a complaint arose, it was to be resolved quickly and with sincere apologies.

The efforts that Leach identified among turn-of-the-century department-store owners to paint their businesses as the true sites of popular democracy have been successful beyond what they probably could have imagined at the time. Most Americans now expect corporations to take a stand on contentious social and political issues; in return, corporations have even co-opted some of the language of actual politics, encouraging consumers to “vote with their dollars” for the companies that market themselves on the values closest to their own.

For Americans in a socially isolating culture, living under an all but broken political system, the consumer realm is the place where many people can most consistently feel as though they are asserting their agency. Most people in the United States don’t exactly have a plethora of opportunities to develop meaningful identities outside their economic station: Creative or athletic pursuits are generally cut off when people enter the workforce, fewer people attend religious services than in generations past, and loneliness and alienation are widespread. Americans work long hours, and many of those with disposable income earn it through what the anthropologist David Graeber calls “bullshit jobs”—the kind of empty spreadsheet-and-conference-call labor whose lack of real purpose and meaning, Graeber theorizes, is an ambient psychological stressor on the people performing it. What these jobs do provide, though, is income, the use of which can feel sort of like an identity.

This is not a feature of a healthy society. Even before the pandemic pushed things to further extremes, the primacy of consumer identity made customer-service interactions particularly conflagratory…

“American Shoppers Are a Nightmare”– and as Amanda Mull (@amandamull) explains, customers were nearly this awful long before the pandemic.

* Sophie Kinsella, Confessions of a Shopaholic

###

As we reconsider commerce, we might recall that it was on this date in 1939 that The Wizard of Oz premiered at the Strand Theater in Oconomowoc, Wisconsin– one of four Midwestern test screenings in advance of the Hollywood premiere at Grauman’s Chinese Theater (on August 15).

Considered one of the greats of the American film canon, it was of course based on the work of L. Frank Baum… who, before he created Dorothy and her adventures, was a retail pioneer. An accomplished window dresser (the turn-of-the-20th-century equivalent of a television-commercial director), he founded and edited a magazine called The Show Window, later known as the Merchants Record and Show Window, which focused on store window displays, retail strategies, and visual merchandising; it’s still being published, now as VMSD.


source

“The heart and soul of the company is creativity”*…

Creativity doesn’t have a deep history. The Oxford English Dictionary records just a single usage of the word in the 17th century, and it’s religious: ‘In Creation, we have God and his Creativity.’ Then, scarcely anything until the 1920s – quasi-religious invocations by the philosopher A N Whitehead. So creativity, considered as a power belonging to an individual – divine or mortal – doesn’t go back forever. Neither does the adjective ‘creative’ – being inventive, imaginative, having original ideas – though this word appears much more frequently than the noun in the early modern period. God is the Creator and, in the 17th and 18th centuries, the creative power, like the rarely used ‘creativity’, was understood as divine. The notion of a secular creative ability in the imaginative arts scarcely appears until the Romantic Era, as when the poet William Wordsworth addressed the painter and critic Benjamin Haydon: ‘Creative Art … Demands the service of a mind and heart.’

This all changes in the mid-20th century, and especially after the end of the Second World War, when a secularised notion of creativity explodes into prominence. The Google Ngram chart bends sharply upwards from the 1950s and continues its ascent to the present day. But as late as 1970, practically oriented writers, accepting that creativity was valuable and in need of encouragement, nevertheless reflected on the newness of the concept, noting its absence from some standard dictionaries even a few decades before.

Before the Second World War and its immediate aftermath, the history of creativity might seem to lack its object – the word was not much in circulation. The point needn’t be pedantic. You might say that what we came to mean by the capacity of creativity was then robustly picked out by other notions, say genius, or originality, or productivity, or even intelligence or whatever capacity it was believed enabled people to think thoughts considered new and valuable. And in the postwar period, a number of commentators did wonder about the supposed difference between emergent creativity and such other long-recognised mental capacities. The creativity of the mid-20th century was entangled in these pre-existing notions, but the circumstances of its definition and application were new…

Once seen as the work of genius, how did creativity become an engine of economic growth and a corporate imperative? (Hint: the Manhattan Project and the Cold War played important roles.): “The rise and rise of creativity.”

(Image above: source)

* Bob Iger, CEO of The Walt Disney Company

###

As we lionize the latest, we might recall that it was on this date in 1726 that Jonathan Swift’s Travels into Several Remote Nations of the World. In Four Parts. By Lemuel Gulliver, First a Surgeon, and then a Captain of Several Ships— much better known as Gulliver’s Travels— was first published.  A satire both of human nature and of the “travelers’ tales” literary subgenre popular at the time, it was an immediate hit (John Gay wrote in a 1726 letter to Swift that “It is universally read, from the cabinet council to the nursery”).  It has, of course, become a classic.

From the first edition

source

Written by (Roughly) Daily

October 28, 2020 at 1:01 am

“My favorite special skill on my resume is ‘excellent monkey noises'”*…

 


 

If you walk into any bookstore or library in the world, you’re going to see dozens, possibly even hundreds of books about how to write a good résumé, how to structure it in a way that maximizes what you do best. Many will tell you to keep things under a page if you’re not above a certain age range; others will tell you that there’s nothing worse for making a first impression than a misplaced comma or repeated word.

But one thing that you likely will not find is a book that explains how to make a résumé that dates before 1970 or so. (Probably the first book on the topic with any long-lasting authority is Richard Bolles’ long-running What Color Is Your Parachute? series, a self-help book that discourages the use of spray-and-pray résumé tactics.) Most of them will date to 1980 or beyond, in fact.

While both the résumé and the curriculum vitae existed before then and were frequently asked for in want ads as early as the late 1940s in some professional fields, something appears to have changed in their role starting in the late 1970s and early 1980s—around the time when many service-oriented fields first gained prominence—in which the résumé, particularly in North America, turned into a de facto requirement when applying for most new jobs.

Companies started treating humans as resources around this time, and many workers traded in their blue collars for white ones. It was a big shift, and the résumé was in the middle of it.

Why the name change, though? There are a lot of reasons why “résumé” won out over “application letter,” but I think one of the biggest might come from the education field of the era. The U.S. Department of Education’s Education Resources Information Center launched in 1965, and early in its life, relied on the terminology “document resume” to refer to its bibliographic entries, which are similar to résumés for people. This information reached schools through documents produced by the Education Department, and my theory is that the influence of this material on educators might just have touched the business world, too.

The shifting nature of work also made the need for more personalized applications more necessary. A 1962 book, Analyzing the Application for Employment, noted the overly complex nature of fill-in-the-blank application forms, and that they would often take hours for prospective employees to fill out. In the book, author Irwin Smalheiser of Personnel Associates highlights an example of one such person stuck dealing with complex application processes:

One man we know, who perpetually seems to be looking for work, has devised a neat system for coping with the application blanks he encounters. He has taken the time to complete a detailed summary of his work history which he carries in his wallet. When he is asked to fill out the company application form, he simply copies the pertinent dates and names of the companies for which he worked.

In many ways, a résumé solves this problem. While some level of modification comes with the work of sending out a résumé, you often can reuse it again and again without having to repeat your work. Sure, job applications stuck around for lower-end jobs, like fast food, but the résumé stuck around nearly everywhere else.

In a slower world, it was the best tool we had for applying for a new job. The problem is, the world got faster—and the model began to show its flaws…

The résumé, a document that gained prominence only in the past half-century, became a key part of getting a job.  Soon, it might just disappear entirely.  From the always-illuminating Ernie Smith, “Throw It In The Pile.”

* Ciara Renee

###

As we boil it down (and spice it up), we might recall that it was on this date in 1834 that President Andrew Jackson sent federal troops to intervene in a labor dispute for the first time in U.S. history.  Foreshadowing the notorious cases of federal military intervention in labor disputes during America’s Gilded Age, Jackson quashed labor unrest during the construction of the C&O Canal.

source

 

Written by (Roughly) Daily

January 29, 2020 at 1:01 am

“History in its broadest aspect is a record of man’s migrations from one environment to another”*…


All roads lead from Rome, according to a visual history of human culture built entirely from the birth and death places of notable people. The 5-minute animation provides a fresh view of the movements of humanity over the last 2,600 years.

Maximilian Schich, an art historian at the University of Texas at Dallas, and his colleagues used the Google-owned knowledge base, Freebase, to find 120,000 individuals who were notable enough in their lifetimes that the dates and locations of their births and deaths were recorded.

The list includes people ranging from Solon, the Greek lawmaker and poet, who was born in 637 BC in Athens, and died in 557 BC in Cyprus, to Jett Travolta — son of the actor John Travolta — who was born in 1992 in Los Angeles, California, and died in 2009 in the Bahamas.

The team used those data to create a movie that starts in 600 BC and ends in 2012…
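Schich and colleagues built their animation from a large table of birth and death records. As a rough illustration of the underlying idea (a minimal sketch under stated assumptions, not the team’s actual pipeline), the Python snippet below draws birth-to-death lines through time from a hypothetical CSV of notable people; the file name and column names are invented for the example, and matplotlib’s animation tools stand in for whatever rendering the researchers actually used.

# Minimal sketch: animate birth-to-death moves of notable people over time.
# Assumes a hypothetical CSV "notable_people.csv" with columns:
#   name, birth_year, birth_lon, birth_lat, death_year, death_lon, death_lat
# (negative years = BC). Not Schich et al.'s actual code or data.
import csv
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

def load_people(path="notable_people.csv"):
    with open(path, newline="", encoding="utf-8") as f:
        return [
            {
                "death_year": int(row["death_year"]),
                "birth": (float(row["birth_lon"]), float(row["birth_lat"])),
                "death": (float(row["death_lon"]), float(row["death_lat"])),
            }
            for row in csv.DictReader(f)
        ]

def animate(people, start=-600, end=2012, step=10):
    fig, ax = plt.subplots(figsize=(10, 5))

    def draw(year):
        # Redraw each frame: one faint line per person who has died by `year`,
        # from birthplace to place of death (longitude/latitude coordinates).
        ax.clear()
        ax.set_xlim(-180, 180)
        ax.set_ylim(-90, 90)
        ax.set_title(f"Birth-to-death moves up to year {year}")
        for p in people:
            if p["death_year"] <= year:
                (x0, y0), (x1, y1) = p["birth"], p["death"]
                ax.plot([x0, x1], [y0, y1], lw=0.3, alpha=0.3, color="tab:blue")

    return FuncAnimation(fig, draw, frames=range(start, end + 1, step), interval=50)

if __name__ == "__main__":
    anim = animate(load_people())
    anim.save("migrations.mp4")  # requires ffmpeg; or call plt.show() instead

A faithful reconstruction would also need a basemap and the researchers’ dataset, which the Nature story below points to.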

Learn more (e.g., that more architects than artists died in the French Revolution) at Nature.

* Ellsworth Huntington

###

As we take the long view, we might recall that on this date in 1597, the Hanseatic League (a northern European commercial confederation of merchant guilds and market towns) expelled all English merchants.  The expulsion was a product of ongoing tensions with English and Dutch trading interests; Elizabeth I retaliated the following year by closing the Steelyard, the League’s trading post in London.

The Hanseatic League

source

 

Written by (Roughly) Daily

August 12, 2014 at 1:01 am

“If you want to change the culture, you will have to start by changing the organization”*…

 

But to change it, you have to know what that organization is…


With this 1855 chart, Daniel McCallum, general superintendent of the New York and Erie Railroad, tried to define an organizational structure that would allow management of a business that was becoming unwieldy in its size. The document is generally recognized to be the first formal organizational chart.

Historian Caitlin Rosenthal, writing in the McKinsey Quarterly, points out that the chart was a way for McCallum to get a handle on a complex system made more confusing by the new availability of data from the use of the telegraph (invented in 1844). Information about problems down the track was important to have—it could help prevent train wrecks and further delays—but the New York and Erie’s personnel didn’t have a good sense of who was in charge of managing this data and putting it into action…

Read the whole story in the ever-illuminating Rebecca Onion’s “The First Modern Organizational Chart Is a Thing of Beauty.”

* Anthropologist Mary Douglas

###

As we grapple with the Great Chain of Being, we might recall that it was on this date in 1937 that General Motors formally recognized the United Auto Workers as the collective bargaining representative of GM workers.  The decision came on the heels of a 44-day sit-down strike that had begun in December, 1936, and that had idled 48,000 employees.  Still (to Dr. Douglas’ point), old habits die hard: a few months later, Ford’s security men assaulted and beat UAW organizers at that company’s River Rouge plant.

 source

 

 
