(Roughly) Daily

“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…

As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…

In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…

In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2017 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.

A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.

This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research-intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…

… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.

This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.

Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…

How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.

See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”

* Robert Gilpin

###

As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.

source

“You must have chaos within you to give birth to a dancing star”*…

Last May, Pulitzer Prize-winning journalist and author John Markoff was asked to write an op-ed for The Wall Street Journal on the heels of the murder in San Francisco of tech exec Bob Lee. The paper rejected his piece, leaving Markoff to “suspect that they were looking for more of a ‘drugs, sex and rock & roll’ analysis, which isn’t what they got. My 2005 book What the Dormouse Said is occasionally cited by people making the argument that there is some kind of causal relationship between psychedelic drugs and creativity. I have never believed that to be the case and I’ve always been more interested in sociological than psychological assessments of psychedelics.” 

Happily for us, he has shared it on Medium…

The head-spinning speed with which the murder of software creator Bob Lee went from being a story about rampant crime in San Francisco to a sex and drugs tale of Silicon Valley excess says a great deal about the way the world now perceives the nation’s technology heartland.

Lee, who had gone from being a Google software engineer to become the creator of the mobile finance program Cash App, and who had more recently become the chief product officer for a cryptocurrency company, is now alleged to have been stabbed to death by the brother of a wealthy socialite with whom Lee is thought to have had an affair.

On the surface it would seem to evoke something more out of a Hollywood soap opera than the world’s technology center. But the Valley is more complex than cases like Bob Lee’s, or dark takes on the evils of technology, suggest.

Silicon Valley has always been built around a paradox represented by the built-in tension between the open-source spirit of a hacker counterculture and the naked capitalist ambitions of Sand Hill Road, where the offices of its venture capitalists are concentrated.

Stewart Brand, who authored the Whole Earth Catalog in Menlo Park, Ca., at the same moment the high-tech region was forming in the 1960s, expressed the paradox most clearly at the first meeting of the Hackers Conference in 1984. In responding to Steve Wozniak, Apple’s cofounder, who was describing the danger of technology companies hoarding information, what the audience heard Brand say was “information wants to be free.” Indeed, a decade later that became the rallying cry of the dot-com era, a period in which technology start-ups thrived on disrupting traditional commerce and railing against regulation.

But that is not what Brand said. He actually stated: “Information sort of wants to be expensive because it’s so valuable; the right information at the right point changes your life. On the other hand, information almost wants to be free, because the cost of getting it out is getting lower and lower all the time.”

Brand had been influenced by social scientist Gregory Bateson who proposed the idea of “the double bind” to describe situations in which even when you win, you lose. Understanding that paradox, which was lost in translation, might have saved the Valley from some of the excess that has taken it into the dark territory it has found itself in recently.

From its inception, the very nature of Silicon Valley was about its ability to simultaneously allow diverse cultures to thrive. During the 1960s and 1970s, while Silicon Valley was being formed, you could easily drive from Walker’s Wagon Wheel in Mountain View, where crewcut hard-drinking computer chip designers gathered, to a very different long-haired scene just up the road in Palo Alto and Menlo Park, which surrounded Stanford Research Institute, the Stanford Artificial Intelligence Laboratory and Xerox’s Palo Alto Research Center, the three labs that pioneered the technologies that would become the modern personal computer and the Internet.

The paradox is perhaps best expressed in the formation of Apple Computer — a company that grew out of the separate interests of its two founders. One, Steve Wozniak, was simply interested in building a computer to share with his friends at the Homebrew Computer Club, a hobbyist group founded by a convicted draft resister and a software engineer that would ultimately birth several dozen start-up PC hardware and software companies, including Apple. Wozniak would combine his hacker’s instincts for sharing with Steve Jobs, who had the insight to realize that there would be a market for these machines…

… Silicon Valley engineers believed they were just one good idea away from becoming the next Jobs or Wozniak.

That deeply entrenched culture of risk-taking — and frequent failure — originally exemplified by the Gold Rush, today remains an integral part of the California, and by extension Silicon Valley, Dream.

In recent weeks, much has been made of Lee’s partying lifestyle, which included claims of recreational drug use and attendance at the Burning Man Festival in the Nevada desert, which began on a San Francisco beach and is based on various anti-capitalist principles such as gifting, decommodification and radical inclusion. The festival, which grew out of the counterculture, has come to embrace a very different technology culture, where attendees including Google founders Sergey Brin and Larry Page, former Google CEO Eric Schmidt, Elon Musk, Peter Thiel and Mark Zuckerberg often arrive by corporate jet. It has gained a reputation for going beyond the confines of a traditional California scene, blending technology, art, drugs, and rock & roll.

Experimentation with psychedelic drugs has been a continuous theme for a subculture in Silicon Valley, going back to the 1960s, when a group that included engineers from Ampex and Stanford created a research project to explore the relationship between LSD and creativity.

Yet despite this fascination, originally with psychedelics and more recently with the idea of “microdosing” small amounts of LSD, the science has never been clear…

It is more likely that an alternative proposed by a group of social scientists at the Santa Fe Institute offers a more cogent explanation. Creativity, they argued, takes place at the edge of chaos. And that certainly describes the early Silicon Valley which emerged in the midst of a tumultuous time on the San Francisco mid-peninsula during the Sixties…

Eminently worth reading in full.

* Friedrich Nietzsche

###

As we cultivate creative contradictions, we might recall that it was on this date in 1978 that Ward Christensen and Randy Suess launched the first public dialup computer bulletin board system, or BBS– the foundation of what would eventually become the world wide web, countless online messaging systems, and, arguably, Twitter.

It was several decades before the hardware or the network caught up to Christensen and Suess’ imaginations, but all the basic seeds of today’s online communities were in place when the two launched the first bulletin board…

Bulletin Board Goes Electronic

source

“When languages are lost most of the knowledge that went with them gets lost”*…

We’re all aware that many of the world’s plants and animals are in danger of extinction, but not so mindful that many human languages are in danger of becoming extinct too. Globalization has helped to make the world a smaller place. But, as Stephen Jones illustrates, it has also contributed to the loss of many languages around the world…

Of the 7,168 living languages spoken today, 43% are considered endangered.

In fact, a language dies out every 40 days. The vast majority of endangered languages are found in Indigenous communities, which risk losing the culture and knowledge those languages carry. At current rates, 90% of the world’s languages could disappear over the next 100 years…

For instance, during the 1970s, the Māori language was spoken by just 5% of Māori schoolchildren. Fast forward to today, and 25% speak the language, driven by efforts from the Māori, leading the government to protect it by law.

In Hawaii, just 2,000 people spoke the native language in the 1970s. After the government ensured it was taught in schools, the number of speakers jumped to 18,700 in 2023…

“The State of the World’s 7,168 Living Languages,” from @derivationllc @VisualCap.

AI is being used to preserve endangered languages (e.g., by Google, Microsoft, and Jones’s own organization, Derivation). But in the end, what keeps a language and its cultural impact alive is human use. Check out Wikitongues for an organization that’s devoted to preserving collective wisdom the old-fashioned way.

* Nicholas Ostler

###

As we deepen linguistic diversity, we might recall that it was on this date in 1928 that a milestone in the development of a technology that has contributed to the globalization of culture (and the threat to languages) was achieved: John Logie Baird transmitted a TV image across the Atlantic Ocean (using shortwave radio) from station 2KZ at Purley, England, to Hartsdale, NY.

Baird’s system was electromechanical: a light-sensitive camera behind a rotating disc. The picture was crudely formed from a scan of thirty lines at twelve frames per second. The television receiver in Hartsdale displayed a tiny, uneven– but “readable”– image. Still, this caused a sensation: The New York Times (accurately) compared the event to Marconi’s sending of the letter “S” by radio across the Atlantic, 27 years earlier.

John Logie Baird with his television apparatus, circa 1925 (source)

Written by (Roughly) Daily

February 8, 2024 at 1:00 am

“All history must be mobilized if one would understand the present”*…

Arthur G. Dove (1880-1946); Thunder Shower; 1940

… especially, one might conclude, when it comes to understanding civilizational challenges like climate change. But as Deborah Coen explains, that’s not so straightforward…

Never before in human history has Earth experienced a change in climate as rapid as the shift we’re living through today. Can history hold clues to an upheaval without precedent? That depends on how we frame the question. Scientists tend to have two questions. They want to know how past societies have been impacted by less dramatic episodes of climate variability, and they want to know what has motivated societies to switch from one fuel source to another. Over the past 20 years, historians’ answers have influenced the reports of major international scientific bodies, including the Intergovernmental Panel on Climate Change and the Anthropocene Working Group of the International Union of Geological Sciences.

And yet, following the lead of scientists has constrained how climate historians think about drivers of change. Scientists like to think that change comes from bold new theories and technological breakthroughs. The chemist Paul Crutzen, for instance, popularized the term “Anthropocene” in part to underscore his faith that the solution to the environmental crisis would come from human ingenuity. Today, scientists seek funding for massive projects, from shoring up a melting glacier to constructing climate research centers on the scale of the Manhattan Project. In this spirit, climate historians have tended to tell dramatic stories in which societies fail or succeed according to their ability to impose top-down change.

What these accounts miss are the humble drivers of change that unfold at the scale of everyday life and grow bottom-up rather than top-down. Indeed, a third question is emerging for historians today: what small-scale mechanisms might trigger a transition to a more equitable and sustainable future?…

The appeal of both historical frameworks—collapse and resilience—lies ultimately in their framing of human societies as complex systems that can be modeled much like other components of the “Earth system.” In this respect, historians have dutifully answered the question posed by scientists, and they have done so in scientists’ terms. They have thereby made it possible to integrate the human factor into the models that scientists use to study and predict global change. As one 2018 paper put it, “the idea of building a forecasting engine for societal breakdown is too tempting to resist.”

Such “integrated assessment models,” which incorporate demographic and economic trajectories into forecasts of environmental change, gained currency in the 1990s with the rise of international diplomacy around global warming. The models raised the second thorny question mentioned above: How do energy transitions unfold? What motivates a society to replace one fuel source with another?

Again, the framing of the question conditioned the answers. Implicit is the assumption that human history has inevitably marched towards increasingly energy-dense fuel sources. With the onset of industrialization, animal power, wind power, and waterpower were replaced by coal and peat, which in turn gave way to gas and oil. The task of the historian became a narrow search for the factors that induced a fuel switch in the past—and which, by extension, might motivate a transition to “clean” energy in the future…

Morris and McNeill [Ian Morris’s 2015 Foragers, Farmers, and Fossil Fuels: How Human Values Evolve and J. R. McNeill’s Something New Under the Sun: An Environmental History of the Twentieth-Century World] gave the scientists what they were looking for: a universal, quantitative version of human history. And yet, these historians obscured what I would call the most important feature of history: contingency. The problem stems, first, from their reliance on historical sources—such as bureaucratic records and monumental remains—that tell history from the point of view of states and their elites. Secondly, these histories constrain their field of view by adopting the language of science and policy. The very concept of “sustainability,” much like its partner, “development,” implies that the goal is to continue along the path that got us here. Reading Morris and McNeill, it is hard even to imagine what an alternative would look like—let alone how we could bring it about.

Fortunately, other historians have shown us that the course of industrialization was by no means inevitable. Energy transitions did not go unchallenged. Recent histories of coal mining (Victor Seow, Thomas G. Andrews) and fracking (Conevery Valencius) reveal that ordinary people objected to the extraction of these fuels due to the risks they posed to local communities. Dismissing past critics as shortsighted or irrational misses the point: history could have gone differently…

David Graeber and David Wengrow build on Scott’s insights [James Scott’s Against the Grain: A Deep History of the Earliest States] in their monumental synthesis, The Dawn of Everything: A New History of Humanity (2021). They argue that what has looked to archaeologists like the remains of collapsed early states might be evidence of a conscious decision to abandon the experiment. In sharp contrast to the inevitability of the evolutionist narrative, Graeber and Wengrow stress that humans have repeatedly exercised their freedom to opt out of hierarchical societies and live otherwise. Their message: We might do the same. Our values need not be dictated by the economic choices made by our forebears…

… Morris’s critical review of The Dawn of Everything claims that its attention to exceptions cannot disprove his model. Well then, what can? For a scholar so concerned with scientific credibility, Morris is remarkably unconcerned that his theory fails to pass Popper’s falsifiability test. He closes his review by chiding the authors for their utopianism: “It would be uplifting to think that whatever we dislike about our own age only persists because we have hitherto lacked the imagination and courage to put something better in its place.” One has to wonder: who is this “we” who lacks imagination and courage? Clearly, Morris has missed their point. “Something better” has been put in place again and again, flourishing at smaller or larger scales throughout human history.

Environmental historians Ian Jared Miller and Paul Warde diagnose the problem this way: “Purely quantitative or global approaches to energy” tend to overlook the experiences of those who are not making the decisions but whose lives are affected by them. This oversight is a result of methods that make it “difficult to grasp everyday experience as a prompt to action and an agent of change.” Otherwise put, historians miss a great deal when they rely on the quantitative tools of scientists.

History will never provide a crystal ball, and that’s not what we should ask of it. Nor should we be limited by theories of historical change that consider “events” only as unusual occurrences that were recognized as such by contemporaries. Change can also be the result of an accumulation of small disruptions that goes unnoticed by mainstream observers. Climate historians know this well, since the variability they study was often unremarked upon by those living through it. And yet, climate historians have taken little interest in processes of change that run bottom-up rather than top-down.

This is why climate historians have much to learn from historians of disenfranchised populations…

These histories show that human feelings and values are not dictated by the economic system in which we happen to find ourselves. On the contrary, emotions are unruly and uncontainable; they cannot be quantified and will never serve as input for Earth system models. They can, however, point towards alternative ways of living and relating. Where those alternatives lead, no one can know. But the very fact that human relations are emergent and unpredictable is grounds for hope.

Eminently worth reading in full: “What’s Next for Histories of Climate Change,” from @LAReviewofBooks.

* Fernand Braudel

###

As we include, we might send grateful birthday greetings to a historian with a different focus, but that same urge to inclusion– Alan Lomax; he was born on this date in 1915. A historian, folklorist, musician, and ethnomusicologist, he collected, archived, and distributed recordings of vernacular American music that would surely have otherwise been lost. 

The many, many artists Lomax is credited with discovering and bringing to a wider audience include blues guitarist Robert Johnson, protest singer Woody Guthrie, folk artist Pete Seeger, country musician Burl Ives, Scottish Gaelic singer Flora MacNeil, and country blues singers Lead Belly and Muddy Waters. Lomax recorded thousands of songs and interviews for the Archive of American Folk Song (of which he was the director) at the Library of Congress; and he produced recordings, concerts, and radio shows in the US and in England, which played an important role in preserving folk music traditions in both countries and helped start both the American and British folk revivals of the 1940s, 1950s, and early 1960s.

Lomax at the Mountain Music Festival, Asheville, North Carolina, early 1940s (source)

“There are only two ways to establish competitive advantage: do things better than others or do them differently”*…

… so why is it that there’s so much emulation, so much sameness in the marketplace? Byrne Hobart explains Hotelling’s Law

The economist Harold Hotelling has already inspired one Diff piece, on the optimal extraction rate for finite resources ($), but Hotelling has another rule that explains political non-polarization, fast-followers in tech, and the ubiquity of fried chicken sandwiches. Companies, parties, subcultures tend to converge over time, and a simple model can illustrate how.

Consider a stretch of beach 100 feet long, running east to west. Beachgoers are positioned randomly on the beach. There are two hot dog vendors who are both selling to beachgoers. Where should they position their hot dog stands?

One option is for one vendor to be a quarter of the way from one side, and the other to be a quarter of the way from the other side. This means that everyone is, at most, 25 feet away from a snack. That’s socially optimal. But let’s suppose the vendor on the eastern side moves a bit to the west. Everyone east of them will still patronize them, because they’re still the closer of the two vendors. But some customers in the middle will switch! So the other hot dog seller responds by moving east, and the same result ensues. Eventually, you end up with two vendors in the middle of the beach, and now the median distance customers travel, 25 feet, is the former maximum distance anyone used to travel.
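The dynamic is simple enough to simulate. Here is a minimal sketch (not from Hobart’s piece) under the assumptions of the example above: a 100-foot beach, beachgoers spread uniformly along it, each beachgoer buying from whichever vendor is nearer, and vendors taking greedy one-foot steps; the function names and step size are illustrative.

```python
# Toy simulation of Hotelling's beach example (illustrative assumptions, not
# from the article): a 100-foot beach, uniformly distributed beachgoers, and
# two vendors who each round take a one-foot step only if it raises their share.

BEACH = 100.0  # beach length in feet (assumed, matching the example above)

def share(own: float, rival: float) -> float:
    """Fraction of the beach closer to `own` than to `rival` (ties split evenly)."""
    if own == rival:
        return 0.5
    midpoint = (own + rival) / 2.0
    length = midpoint if own < rival else BEACH - midpoint
    return length / BEACH

def greedy_step(own: float, rival: float, step: float = 1.0) -> float:
    """Stay put, or move one step left or right, whichever yields the larger share."""
    candidates = [own, max(0.0, own - step), min(BEACH, own + step)]
    return max(candidates, key=lambda pos: share(pos, rival))

# Start at the socially optimal quartile positions: 25 ft and 75 ft.
east, west = 25.0, 75.0
for rounds in range(200):
    new_east, new_west = greedy_step(east, west), greedy_step(west, east)
    if (new_east, new_west) == (east, west):
        break  # neither vendor can gain by moving: equilibrium reached
    east, west = new_east, new_west

print(f"after {rounds} rounds: east vendor at {east} ft, west vendor at {west} ft")
# Expected result: both vendors end up at (or within a step of) the 50-foot midpoint.
```

The same `share` logic carries over to the product examples that follow: swap feet of beach for a one-dimensional taste spectrum and the pull toward the middle is unchanged.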

You can substitute any other cost or one-dimensional positioning feature for this. If there are two snack food manufacturers, the socially-optimal setup might be for one of them to sell something sweet and the other to offer something salty, but the same convergent force will eventually lead both of them to sell a salty-sweet combo; customers with specific preferences will be worse-off, but they’ll still have a favorite, while any convergence captures some of the customers who were close to indifferent before.

Making the model more reflective of the real world illustrates some of the assumptions it depends on. For example, we were always assuming exactly two competitors, but now let’s add in new entrants and departures. The model gets simpler if one of the initial two hot dog vendors quits—ironically, in this case the monopolist’s profit-maximizing position is also the social welfare-maximizing position (though they’ll probably respond to their new monopoly power with higher prices). In snack foods, if the formerly diverse market is now just a choice between salty-sweet and sweet-and-salty, a new entrant might introduce some product for the flavor purists and capture the market that way.

The model does describe cases where there are limits to new entrants. For example:

  • In a voting system that tends to disproportionately reward winners, you’ll expect some convergence between parties. They want enough differentiation for voters to feel like they’re making a choice, but beyond that it makes sense for both sides to compete on similarity rather than difference.
  • Businesses that are dependent on continuously delivering products with low value density will tend to have a few national brands and lots of regional ones, as with Coke and Pepsi. Both beverages are pretty similar, and retain just enough differentiation for customers to feel like they’re making a choice. The amount is small, but nonzero: in the 1980s, when Coke was losing to the sweeter Pepsi in blind taste tests, they introduced a new, sweeter formula, a decision they reversed after a consumer backlash.
  • Software products with high switching costs tend to get more complex over time, as standalone products get subsumed as features of a larger suite. As long as the pace of increased complexity is controlled, this doesn’t hurt user experience too much. It does eventually lead to a sprawling mess, but by then a new kind of switching cost kicks in—the sunk cost of learning to use the system is so high that it feels wasteful to switch.
  • This model also describes regulated industries that lack extraordinary returns on capital. It’s hard for a bank to become a monopoly because it takes so much capital to grow, and while even the largest banks have some regional skew to their branch networks, they tend to settle on a similar model. They also diversify in similar ways: issuing credit cards, getting into investment banking, having a capital markets business of some sort, etc…

Hotelling’s simplified model describes an iron law: under the right set of assumptions, sellers end up homogenizing their products, and niche interests end up under-served. In the real world, there are countervailing forces, and in practice the dimensions on which companies compete are messy enough that “adjacent” doesn’t really work. But Hotelling’s Law also illustrates why there’s relentless pressure for homogeneity: if some people like product A and some people prefer product B, something halfway between the two will capture part of that audience. And the producer of A or B is probably in a better position to make this new compromise product C…

One can observe that this dynamic often leads, beyond sameness, to steadily declining value/quality as well. So, one might wonder why, if all it takes is an innovative new entrant to stir up a market, that’s not going on all the time. The answer is, of course, complicated (and varies from sector to sector). But if there is a root cause, your correspondent would suggest, it’s the concentration of power and control in a market that yields oligopolies that can effectively preempt new competitors. (See, e.g., here, here, and here.)

The economics of every burger chain launching a chicken sandwich: “Hotelling’s Law,” from @ByrneHobart.

* Karl Albrecht

###

As we noodle on normalization, we might spare a thought for a man who innovated, then (for a time) enjoyed the benefits of oligopoly: Thomas Crapper; he died on this date in 1910. Crapper popularized the one-piece pedestal flushing toilet that still bears his name in many parts of the English-speaking world.

The flushing toilet was invented by John Harington in 1596; Joseph Bramah patented the first practical water closet in England in 1778; then in 1852, George Jennings received a patent for the flush-out toilet. Crapper’s contribution was promotional (though he did develop some important related inventions, such as the ballcock): in a time when bathroom fixtures were barely mentionable, Crapper, who was trained as a plumber, set himself up as a “sanitary engineer”; he heavily promoted “sanitary” plumbing and pioneered the concept of the bathroom fittings showroom. His efforts paid off: he earned a series of Royal Warrants (providing lavatories for Edward, as Prince of Wales and later as King, and for George V) and enjoyed huge commercial success. To this day, manhole covers bearing Crapper’s company’s name in Westminster Abbey are among London’s minor tourist attractions.

 source

Written by (Roughly) Daily

January 27, 2024 at 1:00 am