(Roughly) Daily


“Lack of ornamentation is a sign of spiritual strength”*…

(Left) Ranganathaswamy Temple, Srirangam. (Middle) Crossness Pumping Station, London. (Right) Lever House, New York.

Why are buildings today drab and simple, while buildings of the past were ornate and elaborately ornamented? Samuel Hughes proposes an answer…

One of the unifying features of architectural styles before the twentieth century is the presence of ornament. We speak of architectural elements as ornamental inasmuch as they are shaped by aesthetic considerations rather than structural or functional ones. Pilasters, column capitals, sculptural reliefs, finials, brickwork patterns, and window tracery are straightforward examples. Other elements like columns, cornices, brackets, and pinnacles often do have practical functions, but their form is so heavily determined by aesthetic considerations that it generally makes sense to count them as ornament too.

Ornament is amazingly pervasive across time and space. To the best of my knowledge, every premodern architectural culture normally applied ornament to high-status structures like temples, palaces, and public buildings. Although vernacular buildings like barns and cottages were sometimes unornamented, what is striking is how far down the prestige spectrum ornament reached: our ancestors ornamented bridges, power stations, factories, warehouses, sewage works, fortresses, and office blocks. From Chichen Itza to Bradford, from Kyiv to Lalibela, from Toronto to Tiruvannamalai, ornament was everywhere.

Since the Second World War, this has changed profoundly. For the first time in history, many high-status buildings have little or no ornament. Although a trained eye will recognize more ornamental features in modern architecture than laypeople do, as a broad generalization it is obviously true that we ornament major buildings far less than most architectural cultures did historically. This has been celebrated by some and lamented by others. But it is inarguable that it has greatly changed the face of all modern settlements. To the extent that we care about how our towns and cities look, it is of enormous importance.

The naive explanation for the decline of ornament is that the people commissioning and designing buildings stopped wanting it, influenced by modernist ideas in art and design. In the language of economists, this is a demand-side explanation: it has to do with how buyers and designers want buildings to be. The demand-side explanation comes in many variants and with many different emotional overlays. But some version of it is what most people, both pro-ornament and anti-ornament, naturally assume.

However, there is also a sophisticated explanation. The sophisticated explanation says that ornament declined because of the rising cost of labor. Ornament, it is said, is labor-intensive: it is made up of small, fiddly things that require far more bespoke attention than other architectural elements do. Until the nineteenth century, this was not a problem, because labor was cheap. But in the twentieth century, technology transformed this situation. Technology did not make us worse at, say, hand-carving stone ornament, but it made us much better at other things, including virtually all kinds of manufacturing and many kinds of services. So the opportunity cost of hand-carving ornament rose. This effect was famously described by the economist William J. Baumol in the 1960s, and in economics it is known as Baumol’s cost disease [see here].

To put this another way: since the labor of stone carvers was now far more productive if it was redirected to other activities, stone carvers could get higher wages by switching to other occupations, and could only be retained as stone carvers by raising their wages so much that stone carving became prohibitively expensive for most buyers. So although we didn’t get worse at stone carving, that wasn’t enough: we had to get better at it if it was to survive against stiffer competition from other productive activities. And so the labor-intensive ornament-rich styles faded away, to be replaced by sparser modern styles that could easily be produced with the help of modern technology. Styles suited to the age of handicrafts were superseded by the styles suited to the age of the machine. So, at least, goes the story.
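To see the cost-disease arithmetic concretely, here is a minimal sketch in Python of a two-sector Baumol economy. The growth rate, horizon, and starting values are illustrative assumptions of mine, not historical estimates; the point is only that the price of a carving rises even though carving itself never gets worse.

```python
# A minimal sketch of Baumol's cost disease (illustrative numbers only).
# Two sectors share one labor market: manufacturing productivity grows,
# stone-carving productivity stays flat. Competition for workers pins
# the carver's wage to manufacturing output, so the cost of a carving
# rises even though carvers are exactly as skilled as before.

MANUFACTURING_GROWTH = 0.03      # assumed 3% annual productivity growth
CARVINGS_PER_WORKER_YEAR = 1.0   # carving productivity, never changes

for year in range(0, 61, 10):
    # Output per manufacturing worker compounds over time...
    manufacturing_output = (1 + MANUFACTURING_GROWTH) ** year
    # ...and the competitive wage must match it to retain workers.
    wage = manufacturing_output
    cost_per_carving = wage / CARVINGS_PER_WORKER_YEAR
    print(f"year {year:2d}: wage {wage:5.2f}, cost per carving {cost_per_carving:5.2f}")
```

After 60 years at 3% growth, the carving costs nearly six times what it did at the start, with no change in demand and no decline in skill. That is the mechanism the supply-side story relies on, and the one Hughes goes on to reject.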

This is what economists might call a supply-side explanation: it says that desire for ornament may have remained constant, but that output fell anyway because it became costlier to supply. One of the attractive features of the supply-side explanation is that it makes the stylistic transformation of the twentieth century seem much less mysterious. We do not have to claim that – somehow, astonishingly – a young Swiss trained as a clockmaker and a small group of radical German artists managed to convince every government and every corporation on Earth to adopt a radically novel and often unpopular architectural style through sheer force of ideas. In fact, the theory goes, cultural change was downstream of fairly obvious technical and economic forces. Something more or less like modern architecture was the inevitable result of the development of modern technology.

I like the supply-side theory, and I think it is elegant and clever. But my argument here will be that it is largely wrong. It is just not true that twentieth-century technology made ornament more expensive: in fact, new methods of production made many kinds of ornament much cheaper than they had ever been. Absent changes in demand, technology would have changed the dominant methods and materials for producing ornament, and it would have had some effect on ornament’s design. But it would not have resulted in an overall decline. In fact, it would almost certainly have continued the nineteenth-century tendency toward the democratization of ornament, as it became affordable to a progressively wider market. Like furniture, clothes, pictures, shoes, holidays, carpets, and exotic fruit, ornament would have become abundantly available to ordinary people for the first time in history.

In other words, something like the naive demand-side theory has been true all along: to exaggerate a little, it really did happen that every government and every corporation on Earth was persuaded by the wild architectural theory of a Swiss clockmaker and a clique of German socialists, so that they started wanting something different from what they had wanted in all previous ages. It may well be said that this is mysterious. But the mystery is real, and if we want to understand reality, it is what we must face…

And face it Hughes does: “The beauty of concrete,” from @SCP_Hughes in @WorksInProgMag.

* Adolf Loos (architect and polemicist of modern architecture)

###

As we ponder plainness, we might send ornate birthday greetings to Sir Bertram Clough Williams-Ellis; he was born on this date in 1883. An architect who resisted the modernist trends of his time, he is best remembered as the creator of the Italianate village of Portmeirion in North Wales, the setting of the wonderful television series The Prisoner (and of the Doctor Who arc The Masque of Mandragora).

Clough Williams-Ellis at Portmeirion in 1969 (source)

Written by (Roughly) Daily

May 28, 2024 at 1:00 am

“The materials of city planning are: sky, space, trees, steel, and cement; in that order and that hierarchy”*…

… problematically, the last of those is among the biggest sources of CO2 emissions on Earth, accounting for between 7 and 8% of the total. Now, Casey Crownhart reports, there may be a way to produce that essential building material in a low- or no-carbon way…

Cement hides in plain sight—it’s used to build everything from roads and buildings to dams and basement floors. But there’s a climate threat lurking in those ubiquitous gray slabs. Cement production accounts for more than 7% of global carbon dioxide emissions—more than sectors like aviation, shipping, or landfills.

Humans have been making cement, in one form or another, for thousands of years. Ancient Romans used volcanic ash, crushed lime, and seawater to build the aqueducts and iconic structures like the Pantheon. The modern version of hydraulic cement—the sort that hardens through a chemical reaction when mixed with water—dates back to the early 19th century. Derived from widely available materials, it’s cheap and easy to make. Today, cement is one of the most-used materials on the planet, with about 4 billion metric tons produced annually.

Industrial-scale cement is a multifaceted climate conundrum. Making it is energy intensive: the inside of a traditional cement kiln is hotter than lava in an erupting volcano. Reaching those temperatures typically requires burning fossil fuels like coal. There’s also a specific set of chemical reactions needed to turn crushed-up minerals into cement—and those reactions release carbon dioxide, the most common greenhouse gas in the atmosphere.
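That “specific set of chemical reactions” is calcination, and its arithmetic is worth seeing. A back-of-envelope sketch using standard molar masses (my rough figures, not numbers from the article):

```latex
% Calcination: heating limestone drives off CO2 before any fuel is burned.
% Molar masses: CaCO3 = 100 g/mol, CaO = 56 g/mol, CO2 = 44 g/mol.
\mathrm{CaCO_3} \xrightarrow{\ \approx 900\,^{\circ}\mathrm{C}\ } \mathrm{CaO} + \mathrm{CO_2}
% So about 44% of the limestone's mass leaves as process CO2, on the
% order of half a tonne of CO2 per tonne of clinker, before counting
% the fossil fuel burned to heat the kiln to sintering temperature.
```

This is why swapping in cleaner kiln fuel only solves part of the problem: the process emissions come from the rock itself.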

One solution to this climate catastrophe might be coursing through the pipes at Sublime Systems. Founded by two MIT battery scientists, the startup is developing an entirely new way to make cement. Instead of heating crushed-up rocks in lava-hot kilns, Sublime’s technology zaps them in water with electricity, kicking off chemical reactions that form the main ingredients in its cement.

Over the course of the past several years, the startup has gone from making batches of cement that could fit in the palm of your hand to starting up a pilot facility that can produce around 100 tons each year. While it’s still tiny compared with traditional cement plants, which can churn out a million tons or more annually, the pilot line represents the first crucial step to proving that electrochemistry can stand up to the challenge of producing one of the world’s most important building materials.

By the end of the decade, Sublime plans to have a full-scale manufacturing facility up and running that’s capable of producing a million tons of material each year. But traditional large-scale cement plants can cost over a billion dollars to build and outfit. Competing with established industry players will require Sublime to scale fast while raising the additional funding it will need to support that growth. The end of 0% interest rates makes such a task increasingly difficult for any business, but especially for one producing a commodity like cement. And in a high-stakes, low-margin industry like construction, Sublime will need to persuade builders to use its material in the first place…

A start-up is working to drive down the carbon footprint of cement production: “How electricity could help tackle a surprising climate villain,” from @casey_crownhart in @techreview.

See also: “We are closing in on zero-carbon cement.”

* Le Corbusier

###

As we prioritize progress, we might note that it was on this date in 1942 that Henry Ford patented the Soybean car. Per Wikipedia:

… a concept car built with agricultural plastic. The New York Times in 1941 states the car body and fenders were made from a strong material derived from soy beans, wheat and corn. One article claims that they were made from a chemical formula that, among many other ingredients, included soy beans, wheat, hemp, flax and ramie; while the man who was instrumental in creating the car, Lowell E. Overly, claims it was “…soybean fiber in a phenolic resin with formaldehyde used in the impregnation” (Davis, 51). The body was lighter and therefore more fuel efficient than a normal metal body. It was made in Dearborn, Michigan and was introduced to public view on August 13, 1941. It was made, in part, as a hedge against the rationing of steel during World War II. It was designed to run on hemp fuel.

World’s first plastic car body (source)
Plastic car frame patent 2,269,452, January 13, 1942 (source)

“I thought about the screws and their happiness”*…

A contemplation of craft…

I clock my screws, meaning I orient the slot in the screw heads so they are all vertical or horizontal. But I don’t think it’s a mark of superior aesthetics. It’s just something I do, like lining up the silverware on the dining table just so. I can’t help it…

Yesterday I took a drive to Columbus, Ind., one of the country’s repositories of excellent post-war architecture. Check out the Wikipedia page. Or the NPR story on the town. Or the great Kogonada-directed movie, “Columbus.”

My favorite building we toured was the First Christian Church, designed by Eliel Saarinen. Considered one of the first modern church structures in America, the building offers nod after nod to the cathedrals and churches of Europe. Yet the building, completed during World War II, is a complete break with the Old World. Even after 75 years, the church feels like a beacon of hope, optimism and light.

One of the prominent features of the interiors is the extensive wooden lattice work, which is affixed with tens of thousands of perfectly clocked screws…

“The Church of the Clocked Screws,” from Christopher Schwarz (@RudeMechanic) at Lost Art Press, eminently worthy of reading in full.

* Haruki Murakami

###

As we applaud alignment, we might recall that it was on this date in 1852 that Cullen Whipple was awarded patent 9,477 for his “Mechanism for Pointing and Threading Screw-Blanks in the Same Machine.” A machinist, Whipple had invented the first practical device for making pointed screws (a marked improvement on earlier screws, which were blunt-ended and required the drilling of “starter holes”). Whipple joined with partners to incorporate The New England Screw Co., then went on to invent and patent seven other machines that improved the manufacture of screws.

 source

“They swore by concrete. They built for eternity”*…

Understanding how the materials we use work– and don’t work– together…

For most of a red swamp crayfish’s life, Cambarincola barbarae are a welcome sight. Barbarae – whitish, leech-like worms, each a couple of millimeters long – eat the swamp scum off the crayfish’s shells and gills, and in most cases improve the crayfish’s health and life expectancy. Together, barbarae and crayfish form a mutualistic symbiotic relationship. Both species benefit from their cohabitation, and barbarae have evolved to the point where their entire life cycle, from egg to adult, occurs while attached to a crayfish.

But their symbiosis is contextual – a tentative truce. Young crayfish (who molt their shells more frequently and therefore accumulate less scum) don’t need much cleaning, and will take pains to remove barbarae from their shells. And even when molting has slowed and a crayfish has allowed the symbiosis to flourish, there are limits to barbarae’s loyalty: If there isn’t enough food for them to survive, they’ll turn parasitic, devouring their host’s gills and eventually killing them.

Like symbioses, composite materials can be incredibly productive: two things coming together to create something stronger. But like crayfish and barbarae, their outcomes can also be tragic. Rarely are two materials a perfect match for each other, and as the environment changes their relationship can turn destructive. And when composites turn destructive – as was evident in the reinforced concrete when the Champlain Towers South was inspected back in 2018 – the fallout can be catastrophic.

The history of what we now call composite materials goes back many thousands of years. For modern consumers, the most common composites are fiber-reinforced plastics (the colloquial “carbon fiber” and “fiberglass”), but perhaps the first composites in history were reinforced mud bricks. The Mesopotamians learned to temper their bricks by mixing straw into them at least as early as 2254 BC, increasing their tensile strength and preventing them from cracking as they dried. This method is still used around the world today.

But by far the most commonly used composite material in history is steel-reinforced concrete. Roman concrete usage started as early as 200 BCE, and almost three centuries later Pliny the Elder included a note about what appears to be high quality hydraulic concrete in his Naturalis Historia. These recipes were subsequently forgotten, and the material largely disappeared between the Pantheon and the mid-nineteenth century. Modern concrete involves some legitimate process control: limestone and other materials are heated to around 1,450 °C to create portland cement clinker, which is then pulverized and mixed with water (and aggregate), kicking off an exothermic reaction that results in a hard and durable object. The entire process consumes vast amounts of power and produces vast amounts of carbon dioxide, and the industry supporting it today is estimated to be worth about half a trillion dollars.

But in spite of the fortunes that have been invested in the portland cement process (as well as in a wide range of concrete admixtures, which are used to tune both the wet mixture and the finished product), the true magic of contemporary concrete is the fact that it is so often reinforced with steel – dramatically increasing its tensile strength and making it suitable for a wide range of structural applications. This innovation arose in the mid-nineteenth century, developed between 1848 and 1867 by three successive Frenchmen. In the late 1870s, around the time that the first reinforced concrete building was built in New York City, the American inventor Thaddeus Hyatt noted a critical quality of the material: through some fantastic luck, the coefficients of thermal expansion of steel and concrete are strikingly similar, allowing a composite steel-concrete structure to withstand warm/cool cycles without fracturing. This quality opened up the floodgates, and in the 1880s the pioneering architect-engineer Ernest Ransome built a string of reinforced concrete structures around the San Francisco Bay Area. From there it was history.
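To put rough numbers on Hyatt’s “fantastic luck,” here is a worked example with typical handbook coefficients (illustrative textbook values, not figures from the article):

```latex
% Linear thermal expansion, typical handbook values:
\alpha_{\text{steel}} \approx 12 \times 10^{-6}/^{\circ}\mathrm{C},
\qquad
\alpha_{\text{concrete}} \approx 10\text{--}12 \times 10^{-6}/^{\circ}\mathrm{C}
% Differential strain over a 40-degree seasonal swing, worst case:
\Delta\varepsilon = (\alpha_{\text{steel}} - \alpha_{\text{concrete}})\,\Delta T
\approx 2 \times 10^{-6} \times 40 = 8 \times 10^{-5}
% That is less than a tenth of a millimeter per meter, small enough
% for the bond between rebar and concrete to survive the cycle intact.
```

Had the two coefficients differed by an order of magnitude, every winter would pry the rebar loose from its matrix, and the composite would never have worked.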

More than any other physical technology, it is reinforced concrete that defines the 20th century. Versatile, strong, and (relatively) durable, the material is critical to life and industry as we know it. Reinforced concrete was the material of choice of Albert Kahn, who with Henry Ford defined 20th century industrial architecture; reinforced concrete is a key part of nearly every type of logistical infrastructure, from roads to bridges to container terminals; reinforced concrete makes up the literal launch pads for human space travel. It’s a critical component of power plants, dams, wind turbines, and the vast majority of mid- to late-twentieth century homes and apartment buildings. Its high compressive strength makes it ideally suited for footings and foundations; its high tensile strength lets it cantilever and span great distances easily.

But reinforced concrete is really only 140 years old – the blink of an eye, as far as the infrastructure of old is concerned. The Pantheon was built around 125 CE, by which time the Romans had been experimenting with concrete construction for well over 300 years. When we see the Pantheon, we’re seeing a mature method – a technology with full readiness, being used in an architectural style that’s tuned for its physical properties.

By contrast, even our most iconic steel-reinforced concrete buildings are prototypes…

Early on in the history of steel-reinforced concrete, it was known that the high alkalinity of concrete helped to inhibit the rebar from rusting. The steel was said to be sealed within a monolithic block, safe from the elements and passivated by its high pH surroundings; it would ostensibly last a thousand years. But atmospheric carbon dioxide inevitably penetrates concrete, reacting with lime to produce calcium carbonate – and lowering its pH. At that point, the inevitable cracks and fissures allow the rebar inside to rust, whereupon it expands dramatically, cracking the concrete further and eventually breaking the entire structure apart.
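The chemistry in that paragraph can be written out compactly. A sketch of the standard reactions (the pH thresholds are rough values from the concrete literature, not from the article):

```latex
% Carbonatation: atmospheric CO2 neutralizes the calcium hydroxide that
% keeps the pore water alkaline (fresh concrete sits near pH 13).
\mathrm{Ca(OH)_2} + \mathrm{CO_2} \longrightarrow \mathrm{CaCO_3} + \mathrm{H_2O}
% Once the pore water falls below roughly pH 9, the passive oxide film
% on the rebar breaks down and the iron oxidizes:
4\,\mathrm{Fe} + 3\,\mathrm{O_2} + 2\,\mathrm{H_2O} \longrightarrow 4\,\mathrm{FeO(OH)}
% The rust occupies several times the volume of the parent steel, which
% is what wedges the surrounding concrete apart from the inside.
```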

This process – carbonatation, followed by corrosion and failure – was often visible but largely ignored into the late twentieth century. Failures in reinforced concrete structures were often blamed on shoddy construction, but the reality is that like the crayfish and the barbarae, the truce between concrete and steel is tentative. What protection concrete offers steel is slowly eaten away by carbonatation, and once it’s gone the steel splits the concrete apart from the inside…

There are of course many potential innovations to come in reinforced concrete. Concrete mixtures made with fly ash and slag produce high strength and durable structures. Rebar rust can be mitigated by using sacrificial anodes or impressed current. Rebar can be made of more weather resistant materials like aluminum bronze and fiberglass. Or the entire project could be scrapped – after all, the CO2 emitted by the cement industry is nothing to thumb your nose at. Whatever we do, we should remember that the materials we work with are under no obligation to get along with one another – and that a symbiotic truce today doesn’t necessarily mean structural integrity tomorrow.

On composites, crayfish, and reinforced concrete’s tentative alkalinity: “A Symbiotic Truce,” from Spencer Wright (@pencerw), whose newsletter, “The Prepared” (@the_prepared), is always an education.

* Günter Grass

###

As we delve into durability, we might recall that it was on this date in 315 that the Arch of Constantine officially opened. A triumphal arch in Rome dedicated to the emperor Constantine the Great, it was constructed of Roman concrete, faced with brick, and revetted in marble.

Roman concrete, like any concrete, consists of an aggregate and hydraulic mortar – a binder mixed with water (often sea water) that hardens over time. The aggregate varied, and included pieces of rock, ceramic tile, and brick rubble from the remains of previously demolished buildings. Gypsum and quicklime were used as binders, but volcanic dusts, called pozzolana or “pit sand”, were favored where they could be obtained. Pozzolana makes the concrete more resistant to salt water than modern-day concrete.

The strength and longevity of Roman marine concrete is understood to benefit from a reaction of seawater with a mixture of volcanic ash and quicklime to create a rare crystal called tobermorite, which may resist fracturing. As seawater percolated within the tiny cracks in the Roman concrete, it reacted with phillipsite naturally found in the volcanic rock and created aluminous tobermorite crystals. The result is a candidate for “the most durable building material in human history.” In contrast, as Wright notes above, modern concrete exposed to saltwater deteriorates within decades.

source

“a total space, a complete world, a kind of miniature city”*…

John Portman’s Atlanta Hyatt Regency, which opened in 1967, kicked off a major atrium-hotel-building craze

If you’re craning your neck as severely when you step inside a building as you did outside it, you might be in an atrium hotel, an intensely American structure for sleep, conferences, cocktails, and much more. These are facilities built around a massive central chamber stretching a dozen or several dozen stories into the sky; at the lobby level, you’ll find bars, restaurants, gardens, live birds, and maybe even a boat or two.

We don’t build them much anymore, but Americans invented, perfected and exported this unique building style to the world (where it continues to prosper). Birthed in brash excess, atrium hotels were first seen as too gaudy by the modernist architectural establishment and as too profligate by penny-pinching chain hoteliers. To varying observers, they suggest everything from Disney to dystopia. But in their heyday, these buildings promised — and delivered — a spectacle like no other.

Real estate developer Trammell Crow, the man with the most Dallas-sounding name you’ve ever heard, provided early inspiration for the form with his Dallas Trade Mart atrium, built in 1958. But it was Atlanta architect-developer John Portman, his occasional partner, who adapted and built the form into a colossus. Portman’s Hyatt Regency Atlanta opened in 1967, and was an immediate sensation. Atriums became a signature of the Hyatt Regency brand, and Portman went on to work for a variety of other chains, including Marriott and Westin. Atriums later became a standard feature of most Embassy Suites…

The benefit wasn’t just grand views from the lobby, but from every floor; each hallway was suddenly a balcony. Inside that central volume of space, hotels stuffed a range of embellishments. “One would move through a set of functions and experiences as one might a city: from home, to garden, to urban plaza, cafe, and bar,” wrote architectural historian Charles Rice, of the University of Technology Sydney, in his book Interior Urbanism: Architecture, John Portman, and Downtown America.

The trouble was, some critics saw, that these atrium hotels tended to create, as Rice’s title indicates, a new urbanism that was purely inside. Amenities that once faced streets were pulled indoors and replaced with blank walls and hard-to-find entrances. That formula – so irresistible during an era of urban crisis and decay in the 1970s and ’80s – lost some appeal when cities staged a comeback and the streets again beckoned with their own attractions…

Portman’s first atrium wasn’t in a hotel at all, but in the now-demolished Antoine Graves public housing tower in Atlanta, built in 1965. The idea was simple, says Mickey Steinberg, a structural engineer on many of Portman’s early projects. The architect was just trying to provide some sociable space and ventilation to tenants. (The building was not air conditioned.) “If I had a hole down the center of the building,” Steinberg recalls Portman saying, “people could come out and talk to each other and I might be able to get some air through the building.” 

That notion recurred to Portman two years later for the Hyatt Regency. “It wasn’t any grand philosophy about a style of architecture,” Steinberg says. “He was designing for people to want to be there.”

He was also designing for people who might not have wanted to be in Atlanta, whose central business district was in decline. Steinberg recalled Portman’s intention: “I’m going to create a space for them to want to be in, because downtown Atlanta doesn’t have it anymore.”

The Portman-style skyscraper atrium revived a 19th century tradition: the grand hotel lobby, with its adjoining restaurants, ballrooms and other such attractions. In the motel age, these spaces had often been pared back to a mere desk for paperwork. (You’d even usually go elsewhere for that one ineradicable amenity of the ice machine.) Portman bet that guests would embrace spectacle and activity again…

The atrium concept didn’t initially enthrall the moneymen… Bill Marriott had one look and said, “Don’t bother with it. Motels are the thing.” Conrad Hilton famously called it a “concrete monster.” A then-unknown savior turned up in the form of Don Pritzker, whose nascent Hyatt chain then had only three locations.

That bet paid off once the Hyatt Regency Atlanta opened: visits to the hotel in the first four months of operation exceeded the number projected for its first five years. Guests lined up just to go up and down in the glass elevators. And Hyatt ran with the formula, building additional atrium-equipped Regency locations into the 1970s and ’80s…

A consideration of a uniquely American style and of the social, cultural, and economic forces that birthed it: “Into the Heart of the Atrium Hotel.”

* Fredric Jameson, describing Portman’s Bonaventure Hotel in Los Angeles, in Postmodernism

###

As we blow bubbles, we might recall that it was on this date in 1928 that former concert violinist and proprietor of the One-In-Hand Tie Company of Clinton, Iowa, Joseph W. Less, introduced the modern clip-on tie.

source

Written by (Roughly) Daily

December 13, 2020 at 1:01 am