“The roots of the word ‘compete’ are the Latin con petire, which meant ‘to seek together’”*…
Peter Thiel would have us believe that “competition is for losers.” The FTC begs to differ. Earlier this month, it introduced a proposed rule that would eliminate employee noncompete agreements. Scott Galloway explains why this is a very good plan…
… Yesterday’s iconoclasts pull the ladder up behind them the moment they become today’s icons. We’re in general agreement that “anti-competitive” behavior is bad, and have laws against it. Yet companies have been able to convince regulators to look the other way on an increasingly popular weapon of mass entrenchment. They’re passing out OxyContin during an AA meeting. The Oxy? Noncompete agreements…
Noncompete clauses are what firms use to sequester your human capital from competitors. When a new employee signs a noncompete with, say, Johnson & Johnson, they agree that when their employment ends, they won’t work at another pharmaceutical company for a designated period — usually one to two years. If you’re familiar with noncompetes, you likely associate them with technology jobs, where employers want to protect valuable intellectual property. And that’s the defense most often offered for the restrictions. BTW, the argument is bullshit … a confidentiality agreement does the trick.
The irony of noncompetes is they only serve to dampen growth. One of the few places where they’re banned is also home to the world’s most innovative tech economy: California. Job-hopping and seeding new acorns have been part of Silicon Valley since the beginning. In 1994 a Berkeley economist theorized that California’s ban on noncompetes was one of the main reasons Silicon Valley existed at all, and in 2005, economists at the Federal Reserve put forward statistical evidence supporting the theory. Apple, Disney, Google, Intel, Meta, Netflix, Oracle, and Tesla were able to succeed without limiting the options of their employees.
Yet outside California, corporate boardrooms love noncompetes. Historically they were attached only to high-skilled, high-paying jobs. Now they’re becoming ubiquitous across different industries at all levels. Fast-food workers are being forced to sign noncompetes, as are hairstylists and security guards. Roughly a third of minimum wage jobs in America now require such agreements. If forcing noncompetes on America’s lowest-paid workers sounds like indentured servitude, trust your instincts.
Employers claim noncompetes give them the assurance to pay for training and other investments in their employees. There is some evidence that noncompetes are associated with more worker training. But there’s a catch: They also decrease wages. The good news is we’ll train you to operate the fryer, the bad news is we won’t pay you a living wage to do it — and you can’t take a better job across the street.
The FTC estimates that noncompetes reduce employment opportunities for 30 million people and suppress wages by $300 billion per year. That’s far more than the total value of property stolen outright every year. Multiple studies also show that noncompetes reduce entrepreneurship and business formation. Which makes sense — it’s difficult to start a business when talent pools are not accessible or allocated to their best use. Downstream, the lack of competition leads to entrenchment, which eventually results in higher prices for consumers — as one study found has occurred in health care. Everybody loses. Except, of course, the incumbent’s shareholders…
It’s worth remembering the insight of W. Edwards Deming, one of the architects of Japan’s rise to industrial leadership after World War II: “In 1945, the world was in a shambles. American companies had no competition. So nobody really thought much about quality. Why should they? The world bought everything America produced. It was a prescription for disaster.”
The case against noncompete agreements: “Compete,” from @profgalloway.
See also: “Noncompete Agreements Reduce Worker Pay — and Overall Economic Activity” (source of the image above).
* Mihaly Csikszentmihalyi
###
As we remove the shackles, we might recall that it was on this date in 1915 that Ralph Chaplin, a Wobbly (a member of the Industrial Workers of the World), finished his poem “Solidarity Forever”– which, sung to the tune of “John Brown’s Body”/“The Battle Hymn of the Republic,” has become a labor movement anthem.
“A fair day’s-wage for a fair day’s work: it is as just a demand as governed men ever made of governing.”*…
As low-wage employers struggle to find workers, it seems that labor– which has been left behind over the last several decades, as the economic benefits of growth have flowed to executives and owners– may be about to have its day. But will it? And what might that mean?
In her first statement as Treasury Secretary, Janet Yellen said that the United States faced “an economic crisis that has been building for fifty years.” The formulation is intriguing but enigmatic. The last half century is piled so high with economic wreckage that it is not obvious how to name the long crisis, much less how to pull the fragments together into a narrative. One place to start is with the distribution of national income between labor and capital (or, looked at another way, between the wage share and the profit share of national income). About fifty years ago, the share of income going to labor began to decline, forming a statistical record of the epochal collapse of working class power. Episodes of high employment in the 1990s and the late 2010s did not reverse the long-term pattern. Even today, with a combination of easy money and fiscal stimulus unprecedented since World War II, it is unclear what it would take to reverse the trend in distribution.
Few would seriously dispute that hawkish Federal Reserve policies have played a direct role in the decline of the labor share since the 1970s. This is the starting point for thinking about monetary policy and the income distribution, but many questions remain. Today’s expansionary program extends beyond monetary policy to include fiscal stimulus and even industrial policy, but the first sign of an elite rethinking was the Fed’s dovish turn around 2016. (The Fed chair then was Yellen, whose current tenure as Treasury Secretary has been marked by close coordination with her successor, Jerome Powell.) In a fundamental sense, the entire Biden program hangs on the Fed: low interest rates made possible a reevaluation of the cost of massive government debt, which has in turn opened new horizons for a would-be activist government.
If the age of inequality was the product of a hawkish Fed, could a dovish central bank reverse the damage? Today, there is more reason to speak of a “pro-labor turn” than perhaps at any time over the last half century. But history is not so easily reversed. The new policy regime is not a simple course correction to decades of misguided neoliberalism. There is evidence that the current experiment was made possible by a recognition that workers had suffered a secular defeat—specifically, that they had lost the ability to increase or even defend their share of the national income. What would happen if labor became stronger?…
Tim Barker (@_TimBarker) explores: “Preferred Shares,” in Phenomenal World (@WorldPhenomenal).
On a related note: “The economics of dollar stores.”
* Thomas Carlyle
###
As we re-slice the pie, we might send acquisitive birthday greetings to Claude-Frédéric Bastiat; he was born on this date in 1801 (though some sources give tomorrow as his birthday). An economist and writer, he was a prominent member of the French Liberal School. An advocate of classical economics and of the views of Adam Smith, he championed free markets and influenced the Austrian School; indeed, Joseph Schumpeter called him “the most brilliant economic journalist who ever lived”… which is to say that Bastiat was a father of the neoliberal economic movement that’s been central to creating the situation we’re in.
“My favorite special skill on my resume is ‘excellent monkey noises'”*…

If you walk into any bookstore or library in the world, you’re going to see dozens, possibly even hundreds of books about how to write a good résumé, how to structure it in a way that maximizes what you do best. Many will tell you to keep things under a page if you’re not above a certain age range; others will tell you that there’s nothing worse for making a first impression than a misplaced comma or repeated word.
But one thing that you likely will not find is a book on résumé-writing that dates from before 1970 or so. (Probably the first book on the topic with any long-lasting authority is Richard Bolles’ long-running What Color is Your Parachute? series, a self-help book that discourages the use of spray-and-pray résumé tactics.) Most such books, in fact, date to 1980 or beyond.
While both the résumé and the curriculum vitae existed before then and were frequently asked for in want ads as early as the late 1940s in some professional fields, something appears to have changed in their role starting in the late 1970s and early 1980s—around the time when many service-oriented fields first gained prominence—in which the résumé, particularly in North America, turned into a de facto requirement when applying for most new jobs.
Companies started treating humans as resources around this time, and many workers traded in their blue collars for white ones. It was a big shift, and the résumé was in the middle of it.
Why the name change, though? There are a lot of reasons why “résumé” won out over “application letter,” but I think one of the biggest might come from the education field of the era. The U.S. Department of Education’s Education Resources Information Center launched in 1965, and early in its life, relied on the terminology “document resume” to refer to its bibliographic entries, which are similar to résumés for people. This information reached schools through documents produced by the Education Department, and my theory is that the influence of this material on educators might just have touched the business world, too.
The shifting nature of work also made more personalized applications necessary. A 1962 book, Analyzing the Application for Employment, noted the overly complex nature of fill-in-the-blank application forms, which could often take hours for prospective employees to fill out. In the book, author Irwin Smalheiser of Personnel Associates highlights an example of one such person stuck dealing with complex application processes:
One man we know, who perpetually seems to be looking for work, has devised a neat system for coping with the application blanks he encounters. He has taken the time to complete a detailed summary of his work history which he carries in his wallet. When he is asked to fill out the company application form, he simply copies the pertinent dates and names of the companies for which he worked.
In many ways, a résumé solves this problem. While some level of modification comes with the work of sending out a résumé, you often can reuse it again and again without having to repeat your work. Sure, job applications stuck around for lower-end jobs, like fast food, but the résumé stuck around nearly everywhere else.
In a slower world, it was the best tool we had for applying for a new job. The problem is, the world got faster—and the model began to show its flaws…
The résumé, a document that largely gained prominence in the past half-century, was once a key part of getting a job. Soon, it might just disappear entirely. From the always-illuminating Ernie Smith, “Throw It In The Pile.”
###
As we boil it down (and spice it up), we might recall that it was on this date in 1834 that President Andrew Jackson sent federal troops to intervene in a labor dispute for the first time in U.S. history. Foreshadowing the notorious cases of federal military intervention in labor disputes during America’s Gilded Age, Jackson quashed labor unrest during the construction of the C&O Canal.