“Humanity is actually much more cooperative and empathic than given credit for”*…
We looked earlier at the shrinking away of public companies in the U.S., both as a product of consolidation (of operations and of ownership) and of the (potentially dangerous) growth, in their stead, of private equity. University of Michigan professor Jerry Davis has a more optimistic take…
Public corporations have been dominant institutions in the American economy since the dawn of the 20th century. Whether due to their greater efficiency or power, listed corporations spread across nearly all industries. “Capitalism” in America was synonymous with “corporate capitalism,” and the number of exchange-listed companies grew with the size of the economy.
Yet since the late 1990s, the number of listed corporations has dropped by half in the US, underwritten by new technologies that lower the cost of assembling an enterprise. Meanwhile, neglected alternatives to the public corporation both old (e.g., mutuals, cooperatives) and new (e.g., open source, platform coops) have proven surprisingly durable. Given the manifest pathologies of shareholder capitalism, the combination of these two trends may suggest pathways out of our current dilemma…
[Davis explains how both consolidation among listed companies and the rise of private equity have contributed to this drop, but then raises a third, more general explanation…]
A more encompassing interpretation is that information and communication technologies (ICTs) have drastically changed the basic economic calculus of what an enterprise looks like and how it might be funded. In the US context, this has meant that companies prefer “buy” to “make,” as transaction cost enthusiasts might describe it. I coined the term Nikefication to describe the process of vertical dis-integration that reconfigured American industry during the 1990s and 2000s and the options it opens for alternative forms of enterprise, described in detail in previous books…
The vertical dis-integration of the American economy was driven by Wall Street and enabled by ICTs. Ironically, the result is that the capital requirements to create and scale a business can be much lower, reducing the rationale to go public in the first place. Indeed, IPO prospectuses routinely convey that the point of the IPO is not to raise capital, but to create a market for the company’s shares to enable VCs and employees to cash out – which is not the most persuasive pitch to potential buyers, and perhaps helps account for the disastrous post-IPO performance of most new listings.
The asset-lite model means fewer public companies, but it also suggests new possibilities for non-corporate forms that may be more human-scale and democratic. Nike’s profit-driven, asset- and employee-lite model is not the only option enabled by new technologies.
By “noncorporate” I mean forms of economic organization that are not owned by outside shareholders, although they may be legally organized as a corporation. These include mutuals (where consumers or members are also the owners); cooperatives (where workers, producers, or consumers are the owners); municipal enterprises (where citizens or governments own the enterprise); nonprofits; and open source projects. These forms are far more prevalent than one might expect, and in some cases they dominate their industry (e.g., property insurance, server software).
Noncorporate forms of enterprise have proven surprisingly resilient in the US. The Fortune 500 list for 2022 includes at least a dozen mutual insurance companies, including State Farm (#44), New York Life (#71), and Nationwide (#83). The single largest shareholder of over 350 of the 1000 largest American corporations is Vanguard—also a mutual. Land O’Lakes (#213) is an agricultural cooperative owned by its producer-members, as are Ocean Spray and Blue Diamond. Ace Hardware is a retail cooperative in which local stores can be attuned to local needs and tastes yet gain the economies of scale of a large-scale brand. Jessica Gordon Nembhard’s brilliant book Collective Courage documents that cooperative forms thrived in African-American communities for generations – often overlooked by those who find data about the economy solely through online databases. And the US is home to nearly 5000 credit unions, which by law are not-for-profits, owned by their members.
Stanford Law professor Ron Gilson once quipped that if shareholders didn’t exist, they would have to be invented. That’s not quite true: plenty of American enterprises do quite well without shareholders. Indeed, civilization itself might be better without them. As I have written elsewhere, “nearly every major societal pathology in the West today – certainly in the USA – is caused or exacerbated by profit-oriented corporations,” including the opioid epidemic, the obesity crisis, the return of nicotine addiction among the young, democracy-undermining social media, and a climate catastrophe underwritten by the fossil fuel industry. Shareholder capitalism may be a suicide pact. Conversely, cooperatives are inherently democratic and accountable…
Institutional alternatives to public corporations are well-established in the US, and in some cases they lead their industry, such as mutuals in finance and insurance. But cooperatives have historically been thin on the ground here compared to Europe. According to the Democracy At Work Initiative, there were 612 worker cooperatives in 2021 – a 30% increase over 2019, but still a tiny number.
Perhaps the digital revolution has finally created the conditions for cooperatives to thrive. Research from the pre-digital era suggests that one of the factors limiting cooperatives is, for want of a better term, the transaction costs of democracy. A lot of workers’ time spent in meetings to engage in dialogue, debate, and polling is a price that corporate dictatorships don’t have to bear. But newer tools have dramatically reduced the transaction costs of democracy: the same smartphones that enable pervasive corporate surveillance also allow worker voice at scale on a continuous basis.
It is not just transaction costs that have declined: the required assets to start a business are also much cheaper now to own or rent. Capital equipment such as Computer Numerical Control tools, powered by software, gets better and cheaper much the same way other software-powered tools do. (Compare the price of a color laser printer in 1990 to one today.) This is also true of the software required to run an enterprise. It is possible to buy a knockoff version of the enterprise software underlying the Uber app for under $10,000 – and the Drivers Coop in New York is creating a version to “franchise” the locavore driver-owned coop alternative to Uber. The ICTs that dis-integrated the corporate economy have opened space for noncorporate alternatives that might be more democratic and human-scaled.
There are reasons for optimism here. Platform cooperatives merge the benefits of coops with accessible technology, and have been especially effective in industries in which the required new capital investment is low (home cleaning, home health aides, transit). Trebor Scholz’s new book Own This! provides details on the opportunities here. Municipally- or cooperative-owned fabrication facilities can enable enterprises with limited capital to launch and thrive. If the required investment to start a business is low, then the range of alternative institutions, including coops, is correspondingly larger.
The technologies exist to create low-cost alternatives to public corporations. Maybe we are not stuck with the legacy of 20th century corporate capitalism after all…
An optimistic (and aspirational) take on what might follow the economic reign of the public company: “Is This the End of Corporate Capitalism?” from @vanishingcorp via @iftf.
###
As we ponder proprietorship, we might recall that, on this date in 1933, the hospitality industry got a boost as the 21st Amendment to the U.S. Constitution was ratified – repealing the 18th Amendment, which had prohibited the manufacture, transportation, and sale of alcohol. Prohibition had gone into effect in 1920 in an effort to reduce crime and improve public health, but it had backfired: despite massive public investment in enforcement, there was a sharp rise in organized crime (see, e.g., bootleggers like Al Capone stepping in to supply black-market booze) and the emergence of a “scofflaw” attitude on the part of a public that wanted its alcohol.
“A suburb is an attempt to get out of reach of the city without having the city be out of reach”*…

In the mid-twentieth century, in contrast to the noisy and diverse city, the suburbs were seen as spacious, segregated, and quiet— a much more promising state of affairs to corporations bent on expansion. American cities had been spreading out into metropolitan areas since the 19th century; but for most of that time city centers remained the hub of economic and social life. As Louise A. Mozingo explains, that began to change after World War II; residents and businesses alike began to leave…
… As a number of scholars have emphasized, the iconic suburbs of white, middle-class, nuclear families were a well-known part of this story but by no means all of it. Added to prewar suburban expansion, the rapid restructuring of postwar metropolitan areas formed a complexity of patches, spokes, and swaths of separated, specialized, and low-density land uses in the peripheral zones around older city centers, including industry, retail centers, ethnic enclaves, and working-class neighborhoods. This rapid decentralization created the conditions that were conducive to the invention of specialized suburban management facilities by large corporations.
To many privileged Americans of the 1950s and 1960s, the center city appeared to be in a state of inexorable decline. The proliferating automobile inundated the center city’s gridded 19th-century street pattern, and “congestion” seemed intractable and highly detrimental to economic activity. Increasing numbers of people of color walked the streets. Vacancies and abandoned properties were on the rise as tenants relocated to the suburbs and owners could find no replacements. New construction in the city center required homage to an ensconced and layered system of political patronage. Even then, wedging in new skyscrapers that could accommodate large corporate staffs in a single building proved difficult in blocks divided into multiple parcels of land and built out with varied buildings, including many used for industry. To redress these perceived shortcomings, the urban renewal process acquired property, removed tenants, destroyed buildings, and reparceled land in order to insert freeways, offer large lots for corporate offices, supply parking, and confine the poor to mass public housing. In the process, it took apart what remained of the vitality of the old urban core and added to the inventory of open urban lots and dysfunctional neighborhoods. The center city was noisy, diverse, crowded, unpredictable, inflexible, expensive, old, and messy — a dubious state of affairs for postwar capitalists bent on expansion.
In contrast, the suburbs seemed to warrant a sense of forward-looking optimism. At the city’s edge, an effective alliance of well-financed real estate investors, large property owners, local governments, federal loan guarantors, and utopian planners opened property for speedy development. Building along federal- and state-funded road systems that brought these large tracts of land into the economy of metropolitan regions, this alliance conceived of low-density, auto-accessed landscapes of highly specified uses with plenty of parking, and wrote these forms into stringent zoning and building regulations. Once built, these suburban expansion zones were deliberately resistant to change, with the end of producing both social stasis and secure real estate values.
The suburbs as a whole may have been diverse, but the process of building their component parts created insidious racial and class divisions. While the separation of different classes and races of home dwellers is the best-understood part of this spatial process, all kinds of workers were categorically set apart in discrete landscapes as well — corporate executives from factory labor, retail clerks from typists, electronics researchers from accountants. Hence the suburbs were predictable, spacious, segregated, specialized, quiet, new, and easily traversed — a much more promising state of affairs to corporations bent on expansion.
…
My book “Pastoral Capitalism” describes how pioneering projects established the essential landscape patterns of the corporate campus, corporate estate, and office park and how, from those few early projects, other corporations followed suit in great numbers. These landscape types became embedded in the expectations of the corporate class and could, at a glance, embody both the reality and prospect of capitalist power. Hence, the development forms have remained remarkably consistent for six decades. By the end of the 20th century, the suburbs, not the central business district, contained the majority of office space in the United States. This was a new and potent force in the process of suburban expansion…
More at “The Birth of the Pastoral Corporation.”
###
As we ponder the prominence of the periphery, we might send altitudinous birthday greetings to Louis Sullivan; he was born on this date in 1856. An architect, he was hugely influential in the Chicago School, a mentor to Frank Lloyd Wright, and an inspiration to the Chicago group of architects who have come to be known as the Prairie School. He is considered by many to have been the “father of modernism” in architecture (the phrase “form follows function” is attributed to him) and (as he pioneered the steel high-rise) “the father of the skyscraper.”
Indeed, in Sullivan’s honor, this date is National Skyscraper Day.
“People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are”*…
Ed Zitron argues that America has too many managers, and managers misbehaving at that…
In a 2016 Harvard Business Review analysis, two writers calculated the annual cost of excess corporate bureaucracy as about $3 trillion, with an average of one manager per every 4.7 workers. Their story mentioned several case studies—a successful GE plant with 300 technicians and a single supervisor, a Swedish bank with 12,000 workers and three levels of hierarchy—that showed that reducing the number of managers usually led to more productivity and profit. And yet, at the time of the story, 17.6 percent of the U.S. workforce (and 30 percent of the workforce’s compensation) was made up of managers and administrators—an alarming statistic that shows how bloated America’s management ranks had become.
The United States, more than anywhere else in the world, is addicted to the concept of management. As I’ve written before, management has become a title rather than a discipline. We have a glut of people in management who were never evaluated on their ability to manage before being promoted to their role. We have built corporate America around the idea that if you work hard enough, one day you might become a manager, someone who gives rather than takes orders. While this is not the only form of management, based on the response to my previous article and my newsletters on the subject, this appears to be how many white-collar employees feel. Across disparate industries, an overwhelming portion of management personnel is focused more on taking credit and placing blame rather than actually managing people, with dire consequences.
This type of “hall monitor” management, as a practice, is extremely difficult to execute remotely, and thus the coming shift toward permanent all- or part-remote work will lead to a dramatic rethinking of corporate structure. Many office workers—particularly those in industries that rely on the skill or creativity of day-to-day employees—are entering a new world where bureaucracy will be reduced not because executives have magically become empathetic during the pandemic, but because slowing down progress is bad business. In my eyes, that looks like a world in which the power dynamics of the office are inverted. With large swaths of people working from home some or all of the time, managers will be assessed not on their ability to intimidate other people into doing things, but on their ability to provide their workers with the tools they need to measurably succeed at their job.
In order to survive, managers, in other words, will need to start proving that they actually do something. What makes this shift all the more complicated is that many 21st-century, white-collar employees don’t necessarily need a hands-on manager to make sure they get their work done…
The pandemic has laid bare that corporate America disrespects entry-level workers. At many large companies, the early years of your career are a proving ground with little mentorship and training. Too many companies hand out enormous sums to poach people trained elsewhere, while ignoring the way that the best sports teams tend to develop stars—by taking young, energetic people and investing in their future (“trust the process,” etc.). This goes beyond investing in education and courses; it involves taking rising stars in your profession and working to make them as good as your top performer.
In a mostly remote world, a strong manager is someone who gets the best out of the people they’re managing, and sees the forest for the trees—directing workers in a way that’s informed by both experience and respect. Unfortunately, the traditional worker-to-manager pipeline often sets people up for inefficiency and failure. It’s the equivalent of taking a pitcher in their prime and making them a coach—being good at one thing doesn’t mean you can make other people good at the same thing. This is known as the Peter principle, a management concept developed by Laurence J. Peter in the late ’60s that posits that a person who’s good at their job in a hierarchical organization will invariably be promoted to a position that requires different skills, until they’re eventually promoted to something they can’t do, at which point they’ve reached their “maximum incompetence.” Consistent evidence shows that the principle is real: A study of sales workers at 214 firms by the National Bureau of Economic Research found that firms prioritize current job performance in promotion decisions over whether the person can actually do the job for which they’re being considered. In doing so, they’re placing higher value on offering the incentive of promotion to get more out of their workers, at the cost of potentially injecting bad management into their organization.
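The statistical intuition behind the Peter principle can be illustrated with a toy simulation (this is a hypothetical model for illustration, not the methodology of the NBER study): if sales ability and management ability are independent, then promoting the best sellers yields a cohort of managers whose management skill is merely average.

```python
# Toy Monte Carlo sketch of the Peter principle. Assumption (hypothetical):
# each worker's sales skill and management skill are independent draws
# from a uniform distribution on [0, 1).
import random

random.seed(42)

N = 100_000
workers = [(random.random(), random.random()) for _ in range(N)]  # (sales, mgmt)

# Promote the top 10% of workers ranked purely by sales performance.
promoted = sorted(workers, key=lambda w: w[0], reverse=True)[: N // 10]

avg_sales_promoted = sum(w[0] for w in promoted) / len(promoted)
avg_mgmt_promoted = sum(w[1] for w in promoted) / len(promoted)

print(f"promoted cohort: sales skill {avg_sales_promoted:.2f}, "
      f"management skill {avg_mgmt_promoted:.2f}")
# The promoted cohort's sales skill is near the top of the range (~0.95),
# but its management skill hovers around the population mean of 0.5.
```

Under these assumptions, promoting on current performance selects for a skill the new role no longer requires, which is exactly the mechanism the NBER finding describes.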
What I’m talking about here is a fundamental shift in how we view talent in the workplace. Usually, when someone is good at their job, they are given a soft remit to mentor people, but rarely is that formalized into something that is mutually beneficial. A lack of focus on fostering talent is counterintuitive, and likely based on a level of fear that one could train one’s own replacement, or that a business could foster its own competition. This is a problem that could be solved by paying people more money for being better at their job. Growing talent is also a more sustainable form of business—one that harkens back to the days of apprenticeships—where you’re fostering and locking up talent so that it doesn’t go elsewhere, and doesn’t cost you time and money to have to recruit it (or onboard it, which costs, on average, more than $4,000 a person). Philosophically, it changes organizations from a defensive position (having to recruit to keep up) to an offensive position (building an organization from within), and also greatly expands an organization’s ability to scale affordably…
The problem is that modern American capitalism has equated “getting the most out of someone” with “getting the most hours out of them,” rather than getting the most value out of them. “Success,” as I’ve discussed before, is worryingly disconnected from actually succeeding in business.
Reducing bureaucracy is also a net positive for the labor market, especially for young people. Entry-level corporate work is extremely competitive and painful, a years-long process in which you’re finding your footing in an industry and an organization. If we can change the lens through which we view those new to the workforce—as the potential hotshots of the future, rather than people who have to prove themselves—we’ll have stronger organizations that waste less money. We should be trying to distill and export the talents of our best performers, and give them what they need to keep doing great things for our companies while also making their colleagues better too.
All of this seems inevitable, to me, because a remote future naturally reconfigures the scaffolding of how work is done and how workers are organized. The internet makes the world a much smaller place, which means that simple things such as keeping people on task don’t justify an entire position—but mentorship and coaching that can get the best out of each worker do.
Hopefully we can move beyond management as a means of control, and toward a culture that appreciates a manager who fosters and grows the greatness in others.
The pandemic has exposed a fundamental weakness in the system: “Say Goodbye to Your Manager,” from @edzitron.
* Peter Drucker
###
As we reorganize, we might recall that it was on this date in 1852 that Henri Giffard made the first powered and controlled flight of an airship, traveling 27 km from Paris to Élancourt in his “Giffard dirigible.”
Airships were the first aircraft capable of controlled powered flight, and were most commonly used before the 1940s, largely floated with (highly flammable) hydrogen gas. Their use declined as their capabilities were surpassed by those of airplanes – and then plummeted after a series of high-profile accidents, including the 1930 crash and burning of the British R101 in France, the 1933 and 1935 storm-related crashes of the U.S. Navy’s twin helium-filled rigid airborne aircraft carriers, the USS Akron and USS Macon, and – most famously – the 1937 burning of the German hydrogen-filled Hindenburg.

“Corporation: An ingenious device for obtaining profit without individual responsibility”*…
Take a look at any given corporation’s registration docs, and there’s a good shot you’ll see the address 1209 North Orange Street.
Spanning less than a city block in Wilmington, Delaware, this nondescript office building is the official incorporation address of 285k+ companies from all over the world.
On the surface, there’s no reason that Delaware — home to blue hens and Civil War monuments — should be a corporate paradise. It’s the second smallest state in America, and the 6th least populous, with just 986k residents.
Yet, nearly 1.5m businesses from all over the world are incorporated there, including 68% of all Fortune 500 firms.
In the early 19th century, every company had to be incorporated (legally established) in the state where they conducted business — and beholden to that state’s tax codes.
Post-Industrialization, huge firms like Standard Oil and the Whiskey Trust began to consolidate fractured markets. To combat this, many states set up laws aimed at regulating monopolies through heavy taxation.
But New Jersey saw an opportunity to cater to industry.
In 1891, the Garden State adopted an extremely generous corporate tax law that “would allow business to do as business pleases.” By incorporating there, a company based in another state could save big on taxes and enjoy perks like unlimited market expansion.
A flood of conglomerates took up this offer and New Jersey earned so much from taxes that it was able to pay off its entire state debt.
Pressured to incentivize businesses to stay, other states offered their own lenient corporate tax policies.
In this so-called “race to the bottom,” Delaware emerged victorious.
Adopted in 1899, the Delaware General Corporation Law “reduced restrictions upon corporate action to a minimum” and promised to maintain the most hospitable business enclave in the nation — a place where corporations could frolic in the open fields of capitalism, unencumbered by income tax, bureaucratic policing, and shareholder litigation.
In the ensuing decades, many other states (including New Jersey) walked back some of their corporate leniency.
But Delaware didn’t peel back.
Today, the state is still the incorporation zone of choice for corporations. The climate is so favorable that even international firms seek refuge there.
What exactly makes Delaware so enticing?
Nearly 1.5m companies are incorporated in one of America’s smallest states; find out why at: “Why Delaware is the sexiest place in America to incorporate a company.”
* Ambrose Bierce, The Devil’s Dictionary
###
As we peek behind the veil, we might recall that it was on this date in 1939 that John Steinbeck’s The Grapes of Wrath was published. It tells the story of the Joads, a poor family of tenant farmers driven from their Oklahoma home by drought, changes in the agricultural industry, and bank foreclosures that forced tenant farmers out of work. Fleeing the Dust Bowl, the Joads set out, with thousands of other “Okies,” for California, seeking jobs, land, dignity, and a future.
The date was timely: four years earlier– on “Black Sunday,” this date in 1935– one of the most devastating storms of the 1930s Dust Bowl era kicked up clouds of millions of tons of dirt and dust so dense and dark that some eyewitnesses believed the world was coming to an end.
The term “dust bowl” was reportedly coined by a reporter in the mid-1930s and referred to the plains of western Kansas, southeastern Colorado, the panhandles of Texas and Oklahoma, and northeastern New Mexico. By the early 1930s, the grassy plains of this region had been over-plowed by farmers and overgrazed by cattle and sheep. The resulting soil erosion, combined with an eight-year drought which began in 1931, created a dire situation for farmers and ranchers. Crops and businesses failed and an increasing number of dust storms made people and animals sick. Many residents fled the region in search of work in other states such as California (as chronicled in books including John Steinbeck’s The Grapes of Wrath), and those who remained behind struggled to support themselves…
source