(Roughly) Daily

Posts Tagged ‘organization’

“The functionalist organization, by privileging progress (i.e. time), causes the condition of its own possibility”*…

Meet the new boss, painfully similar to the old boss…

While people in and around the tech industry debate whether algorithms are political at all, social scientists take the politics as a given, asking instead how this politics unfolds: how algorithms concretely govern. What we call “high-tech modernism”—the application of machine learning algorithms to organize our social, economic, and political life—has a dual logic. On the one hand, like traditional bureaucracy, it is an engine of classification, even if it categorizes people and things very differently. On the other, like the market, it provides a means of self-adjusting allocation, though its feedback loops work differently from the price system. Perhaps the most important consequence of high-tech modernism for the contemporary moral political economy is how it weaves hierarchy and data-gathering into the warp and woof of everyday life, replacing visible feedback loops with invisible ones, and suggesting that highly mediated outcomes are in fact the unmediated expression of people’s own true wishes…

From Henry Farrell and Marion Fourcade, a reminder that what’s old is new again: “The Moral Economy of High-Tech Modernism,” in an issue of Daedalus, edited by Farrell and Margaret Levi (@margaretlevi).

See also: “The Algorithm Society and Its Discontents” (or here) by Brad DeLong (@delong).

Apposite: “What Greek myths can teach us about the dangers of AI.”

(Image above: source)

* “The functionalist organization, by privileging progress (i.e. time), causes the condition of its own possibility–space itself–to be forgotten: space thus becomes the blind spot in a scientific and political technology. This is the way in which the Concept-city functions: a place of transformations and appropriations, the object of various kinds of interference but also a subject that is constantly enriched by new attributes, it is simultaneously the machinery and the hero of modernity.” – Michel de Certeau

###

As we ponder platforms, we might recall that it was on this date in 1955 that the first computer operating system was demonstrated…

Computer pioneer Doug Ross demonstrates the Director tape for MIT’s Whirlwind machine. It’s a new idea: a permanent set of instructions on how the computer should operate.

Six years in the making, MIT’s Whirlwind computer was the first digital computer that could display real-time text and graphics on a video terminal, which was then just a large oscilloscope screen. Whirlwind used 4,500 vacuum tubes to process data…

Another one of its contributions was Director, a set of programming instructions…

March 8, 1955: The Mother of All Operating Systems

The first permanent set of instructions for a computer, it was in essence the first operating system. Loaded by paper tape, Director allowed operators to load multiple problems in Whirlwind by taking advantage of newer, faster photoelectric tape reader technology, eliminating the need for manual human intervention in changing tapes on older mechanical tape readers.

Ross explaining the system (source)

“There is nothing more tentative, nothing more empirical (superficially, at least) than the process of establishing an order among things; nothing that demands a sharper eye or a surer, better-articulated language”*

James Vincent on the emergence of earliest writing and its impact on culture, with special attention to the phenomenon of the “list” and its role in the birth of metrology…

Measurement was a crucial organizing principle in ancient Egypt, but metrology itself does not begin with nilometers. To understand its place in human culture, we have to trace its roots back further, to the invention of writing itself. For without writing, no measures can be recorded. The best evidence suggests that the written word was created independently thousands of years ago by a number of different cultures scattered around the world: in Mesopotamia, Mesoamerica, China, and Egypt. But it’s in Mesopotamia—present-day Iraq—where the practice is thought to have been invented first.

There’s some debate over whether this invention of writing enabled the first states to emerge, giving their rulers the ability to oversee and allocate resources, or whether it was the demands of the early states that in turn led to the invention of writing. Either way, the scribal arts offered dramatic new ways to process knowledge, allowing for not only superior organization, but also superior thinking. Some scholars argue that the splitting of noun and number on clay tablets didn’t just allow kings to better track their taxes but was tantamount to a cognitive revolution: a leap forward that allowed humans to abstract and categorize the world around them like never before.

Lists may not seem like cognitive dynamite, but their proliferation appears to have helped develop new modes of thought in early societies, encouraging us to think analytically about the world. “The list relies on discontinuity rather than continuity,” writes anthropologist Jack Goody. “[I]t encourages the ordering of the items, by number, by initial sound, by category, etc. And the existence of boundaries, external and internal, brings greater visibility to categories, at the same time as making them more abstract.”…

More at: “What If… Listicles Are Actually an Ancient Form of Writing and Narrative?” from @jjvincent in @lithub

* Michel Foucault

###

As we organize, we might recall that it was on this date in 1872 that the Mary Celeste (often erroneously referred to as Marie Celeste, per a Conan Doyle short story about the ship), an American-registered merchant brigantine, was discovered adrift and deserted in the Atlantic Ocean off the Azores Islands.

The Canadian brigantine Dei Gratia found her in a dishevelled but seaworthy condition under partial sail and with her lifeboat missing. The last entry in her log was dated ten days earlier. She had left New York City for Genoa on November 7 and was still amply provisioned when found. Her cargo of alcohol was intact, and the captain’s and crew’s personal belongings were undisturbed. None of those who had been on board were ever seen or heard from again.

At the salvage hearings in Gibraltar following her recovery, the court’s officers considered various possibilities of foul play, including mutiny by Mary Celeste’s crew, piracy by the Dei Gratia crew or others, and conspiracy to carry out insurance or salvage fraud. No convincing evidence supported these theories, but unresolved suspicions led to a relatively low salvage award.

The inconclusive nature of the hearings fostered continued speculation as to the nature of the mystery. Hypotheses that have been advanced include the effects on the crew of alcohol fumes rising from the cargo, submarine earthquakes, waterspouts, attack by a giant squid, and paranormal intervention.

After the Gibraltar hearings, Mary Celeste continued in service under new owners. In 1885, her captain deliberately wrecked her off the coast of Haiti as part of an attempted insurance fraud.

The ship in 1861 (source)

Written by (Roughly) Daily

December 4, 2022 at 1:00 am

“If you want to change the culture, you will have to start by changing the organization”*…

That’s perhaps especially true of cultural organizations. As Ian Leslie explains, while rock bands are known for drink, drugs, and dust-ups, they have something to teach us: beyond the debauchery lie four models for how to run a business…

… The notion that bands should make music for the love of it was always romantic and now seems positively quaint. Rock groups are mini-corporations (some of them not so mini). Bands such as Coldplay or Kings of Leon operate sophisticated corporate machines that are responsible for multiple revenue streams; at a recent conference, Metallica’s drummer spoke about the importance of using the right customer-engagement software. Yet the music machine ultimately depends on a small group of talented individuals working closely together to create something magical. Once members of a group decide that they can’t stand to be in the same room as each other, the magic stops and the money dries up.

If rock groups are businesses, businesses are getting more like rock bands. Workplaces are far more informal than they used to be, with less emphasis on protocol, rank and authority. Many firms try to cultivate the creativity that can come from close collaboration. Employers attempt to engineer personal chemistry, hiring coaches to fine-tune team dynamics and sending staff on team-building exercises. Employees are encouraged to share lunch, play table tennis and generally hang out. As the founder of Hubble, a London office-space company, put it, “We hope that our team will become friends first, and colleagues second.”…

Successful startups have to make a difficult transition from being a gang of friends working on a cool idea to being managers of a complex enterprise with multiple stakeholders. It’s a problem familiar to rock groups, which can go quickly from being local heroes to global brands, and from being responsible only for themselves to having hundreds of people rely on them for income. In both cases, people who made choices by instinct and on their own terms acquire new, often onerous responsibilities with barely any preparation. Staff who were hired because they were friends or family have their limitations exposed under pressure, and the original gang can have its solidarity tested to destruction. A study from Harvard Business School found that 65% of startups fail because of “co-founder conflict”. For every Coldplay, there are thousands of talented bands now forgotten because they never survived contact with success.

The history of rock groups can be viewed as a vast experimental laboratory for studying the core problems of any business: how to make a group of talented people add up to more than the sum of its parts. And, once you’ve done that, how to keep the band together…

The Beatles, Tom Petty and the Heartbreakers, REM, and the Rolling Stones– four bands, four models for business success: “A rocker’s guide to management,” from @mrianleslie in @1843mag.

* Mary Douglas

###

As we learn from the loudest, we might recall that it was on this date in 1968 that The Beatles (one of the four cases discussed in the piece linked above) performed “Hey Jude,” the #1 song in both the U.S. and the U.K. at the time, on the television show Frost on Sunday on BBC-TV.

Written by (Roughly) Daily

September 8, 2022 at 1:00 am

“Hierarchy works well in a stable environment”*…

… and often not so well in a dynamic, unstable setting. Simon Roberts reminds us of an alternative concept, one that shifts perspectives by taking into account multiple relationships and interdependencies– heterarchy

Some ideas about how the world works feel so obvious as to be beyond question. They have taken on a sense of appearing to be part of the natural order of things. Hierarchy—an arrangement, ranking or classification of people or things on the basis of their importance or value—is one such idea. Hierarchies are evident at scale in societies when classes or castes of people are ranked on the basis of some factor or other (be that wealth, cultural capital or purity). And secular hierarchies are often supported by hierarchies in the realm of the sacred, symbolics or spiritual.

The idea of hierarchy seems so natural because the criteria by which things are ranked have themselves a tendency to appear innate. Consider, for example, class distinctions. These are often expressed in hierarchical terms (“She married beneath herself”, “He’s a social climber”), but are constructed, communicated and cemented by a bewildering array of cultural distinctions that show up sartorially, linguistically, symbolically and through social practice. The result is that the hierarchical ranking of people takes on a logic of its own that is difficult to see for what it is – an invention.

Ideas and practices informed by hierarchy are common in the world of business too. Hierarchy informs organisational design, decision making and cultural practices. These practices naturalise hierarchy. And hierarchy is a feature of the methodologies and frameworks used by consultants, like “need hierarchies” and the propensity for rankings of things like product features or benefits.

What results from the fact that hierarchy is an unquestioned element of the grammar of human existence? It’s that hierarchy has an outsized impact on how we think about culture, society and organisations. But many social, cultural and natural forms are not organised hierarchically. A different lens—that offered by the concept of heterarchy—provides more than a corrective to our obsession with hierarchy. It helps explain more fundamental processes at play in the natural and social world…

Read on to learn more about an organizing (and organizational) framework, rooted in nature, that’s “built” for the turbulent times that we’re in: “How heterarchy can help us put hierarchy in its place,” from @ideasbazaar and @stripepartners.

See also: “Heterarchy: An Idea Finally Ripe for Its Time,” by (your correspondent’s old friend and partner) Jay Ogilvy (@JayOgilvy), whose wonderful book, Many Dimensional Man, explores heterarchy deeply.

And, also apposite, see Cory Doctorow’s (@doctorow) “A useful, critical taxonomy of decentralization, beyond blockchains”; while the word “heterarchy” never appears, its spirit is present in the description of the approach that intrigues him…

* Mary Douglas

###

As we rethink relationships, we might spare a thought for Harry Burnett “H. B.” Reese; he died on this date in 1956. A candy-maker who began his career working in the Hershey’s Chocolate factory, he began to moonlight, creating confections in his basement. In 1923, he started his own company, H.B. Reese Candy Company, manufacturing a selection of sweets. Then, in 1928, he created the Reese’s Peanut Butter Cup. A huge hit, it came to dominate his line– and ultimately became the best-selling candy in America. Reese is enshrined in the Candy Hall of Fame.

source

“People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are”*…

Ed Zitron argues that America has too many managers, and managers misbehaving at that…

In a 2016 Harvard Business Review analysis, two writers calculated the annual cost of excess corporate bureaucracy as about $3 trillion, with an average of one manager per every 4.7 workers. Their story mentioned several case studies—a successful GE plant with 300 technicians and a single supervisor, a Swedish bank with 12,000 workers and three levels of hierarchy—that showed that reducing the number of managers usually led to more productivity and profit. And yet, at the time of the story, 17.6 percent of the U.S. workforce (and 30 percent of the workforce’s compensation) was made up of managers and administrators—an alarming statistic that shows how bloated America’s management ranks had become.

The United States, more than anywhere else in the world, is addicted to the concept of management. As I’ve written before, management has become a title rather than a discipline. We have a glut of people in management who were never evaluated on their ability to manage before being promoted to their role. We have built corporate America around the idea that if you work hard enough, one day you might become a manager, someone who makes rather than takes orders. While this is not the only form of management, based on the response to my previous article and my newsletters on the subject, this appears to be how many white-collar employees feel. Across disparate industries, an overwhelming portion of management personnel is focused more on taking credit and placing blame rather than actually managing people, with dire consequences.

This type of “hall monitor” management, as a practice, is extremely difficult to execute remotely, and thus the coming shift toward permanent all- or part-remote work will lead to a dramatic rethinking of corporate structure. Many office workers—particularly those in industries that rely on the skill or creativity of day-to-day employees—are entering a new world where bureaucracy will be reduced not because executives have magically become empathetic during the pandemic, but because slowing down progress is bad business. In my eyes, that looks like a world in which the power dynamics of the office are inverted. With large swaths of people working from home some or all of the time, managers will be assessed not on their ability to intimidate other people into doing things, but on their ability to provide their workers with the tools they need to measurably succeed at their job.

In order to survive, managers, in other words, will need to start proving that they actually do something. What makes this shift all the more complicated is that many 21st-century, white-collar employees don’t necessarily need a hands-on manager to make sure they get their work done…

The pandemic has laid bare that corporate America disrespects entry-level workers. At many large companies, the early years of your career are a proving ground with little mentorship and training. Too many companies hand out enormous sums to poach people trained elsewhere, while ignoring the way that the best sports teams tend to develop stars—by taking young, energetic people and investing in their future (“trust the process,” etc.). This goes beyond investing in education and courses; it involves taking rising stars in your profession and working to make them as good as your top performer.

In a mostly remote world, a strong manager is someone who gets the best out of the people they’re managing, and sees the forest from the trees—directing workers in a way that’s informed by both experience and respect. Unfortunately, the traditional worker-to-manager pipeline often sets people up for inefficiency and failure. It’s the equivalent of taking a pitcher in their prime and making them a coach—being good at one thing doesn’t mean you can make other people good at the same thing. This is known as the Peter principle, a management concept developed by Laurence J. Peter in the late ’60s that posits that a person who’s good at their job in a hierarchical organization will invariably be promoted to a position that requires different skills, until they’re eventually promoted to something they can’t do, at which point they’ve reached their “maximum incompetence.” Consistent evidence shows that the principle is real: A study of sales workers at 214 firms by the National Bureau of Economic Research found that firms prioritize current job performance in promotion decisions over whether the person can actually do the job for which they’re being considered. In doing so, they’re placing higher value on offering the incentive of promotion to get more out of their workers, at the cost of potentially injecting bad management into their organization.

What I’m talking about here is a fundamental shift in how we view talent in the workplace. Usually, when someone is good at their job, they are given a soft remit to mentor people, but rarely is that formalized into something that is mutually beneficial. A lack of focus on fostering talent is counterintuitive, and likely based on a level of fear that one could train one’s own replacement, or that a business could foster its own competition. This is a problem that could be solved by paying people more money for being better at their job. Growing talent is also a more sustainable form of business—one that harkens back to the days of apprenticeships—where you’re fostering and locking up talent so that it doesn’t go elsewhere, and doesn’t cost you time and money to have to recruit it (or onboard it, which costs, on average, more than $4,000 a person). Philosophically, it changes organizations from a defensive position (having to recruit to keep up) to an offensive position (building an organization from within), and also greatly expands an organization’s ability to scale affordably…

The problem is that modern American capitalism has equated “getting the most out of someone” with “getting the most hours out of them,” rather than getting the most value out of them. “Success,” as I’ve discussed before, is worryingly disconnected from actually succeeding in business.

Reducing bureaucracy is also a net positive for the labor market, especially for young people. Entry-level corporate work is extremely competitive and painful, a years-long process in which you’re finding your footing in an industry and an organization. If we can change the lens through which we view those new to the workforce—as the potential hotshots of the future, rather than people who have to prove themselves—we’ll have stronger organizations that waste less money. We should be trying to distill and export the talents of our best performers, and give them what they need to keep doing great things for our companies while also making their colleagues better too.

All of this seems inevitable, to me, because a remote future naturally reconfigures the scaffolding of how work is done and how workers are organized. The internet makes the world a much smaller place, which means that simple things such as keeping people on task don’t justify an entire position—but mentorship and coaching that can get the best out of each worker do.

Hopefully we can move beyond management as a means of control, and toward a culture that appreciates a manager who fosters and grows the greatness in others.

The pandemic has exposed a fundamental weakness in the system: “Say Goodbye to Your Manager,” from @edzitron.
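The Peter-principle mechanism Zitron describes– promoting on current performance even when managerial skill is a different trait– is easy to see in a toy simulation. This sketch is my own illustration, not the NBER study’s model: it assumes each worker has a sales skill and a managerial skill that are only weakly correlated, then compares the manager a firm ends up with under the two promotion rules.

```python
import random

random.seed(42)

def simulate(n_workers=1000, n_trials=200, corr=0.2):
    """Compare two promotion rules:
    (1) promote the top salesperson (current-performance rule),
    (2) promote the worker with the highest managerial aptitude.
    Sales and managerial skill are weakly correlated (corr)."""
    by_sales_total, by_mgmt_total = 0.0, 0.0
    for _ in range(n_trials):
        workers = []
        for _ in range(n_workers):
            sales = random.gauss(0, 1)
            # managerial skill shares only a small component with sales skill
            mgmt = corr * sales + (1 - corr**2) ** 0.5 * random.gauss(0, 1)
            workers.append((sales, mgmt))
        # Rule 1: the promoted manager is the best current performer
        by_sales_total += max(workers, key=lambda w: w[0])[1]
        # Rule 2: the promoted manager is the best potential manager
        by_mgmt_total += max(workers, key=lambda w: w[1])[1]
    return by_sales_total / n_trials, by_mgmt_total / n_trials

by_sales, by_mgmt = simulate()
print(f"avg managerial skill, promoted by sales record: {by_sales:.2f}")
print(f"avg managerial skill, promoted by aptitude:     {by_mgmt:.2f}")
```

Under these assumptions the performance-based rule reliably installs a far weaker manager than selecting on managerial aptitude directly– the gap the study attributes to firms using promotion as an incentive rather than a staffing decision.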

* Peter Drucker

###

As we reorganize, we might recall that it was on this date in 1852 that Henri Giffard made the first powered and controlled flight of an airship, traveling 27 km from Paris to Élancourt in his “Giffard dirigible.”

Airships were the first aircraft capable of controlled powered flight, and were most commonly used before the 1940s, largely floated with (highly flammable) hydrogen gas. Their use decreased as their capabilities were surpassed by those of airplanes– and then plummeted after a series of high-profile accidents, including the 1930 crash and burning of the British R101 in France, the 1933 and 1935 storm-related crashes of the U.S. Navy’s twin helium-filled rigid airships (and airborne aircraft carriers), the USS Akron and the USS Macon respectively, and– most famously– the 1937 burning of the German hydrogen-filled Hindenburg.

The Giffard dirigible (source)

Written by (Roughly) Daily

September 24, 2021 at 1:00 am