(Roughly) Daily


“Food is an important part of a balanced diet”*…

In your quest to eat right, are you a nutritionist or an essentialist?

Nutrition science began with the chemical description of proteins, fats and carbohydrates in the 19th century. The field didn’t seem to hold much medical import; the research was mostly aimed at cheaply feeding poor and institutionalised people well enough to keep them from rioting. Germ theory, on the other hand, was new and revolutionary medical science, and microbiologists such as Louis Pasteur were demonstrating that one disease after another, from cholera to malaria to leprosy, was caused by microbes. But at the turn of the 20th century, nutrition science suddenly arrived as a major part of our understanding of human health…

In 1911, the Polish chemist Casimir Funk announced that he’d isolated the beriberi-preventing chemical, which he thought to be a molecule containing an amine group, and named it ‘vitamine’ – a vital amine. The next year, Funk published an ambitious paper and book arguing that not only beriberi but three other human diseases – scurvy, pellagra and rickets – were each caused by a lack of a particular vitamin. Within a few months, the English researcher Frederick Hopkins published the results of a series of experiments in which he fed animals diets based on pure proteins, carbohydrates and fats, after which they developed various ailments. He posited that the simplified diets lacked some ‘accessory food factors’ important for health. Those factors and many others were discovered over the next three decades, and researchers showed how these vitamins were critical to the function of practically every part of the body. Ten of those scientists, including the Dutch physician Christiaan Eijkman, whose earlier beriberi experiments had pointed the way, and Hopkins, won Nobel prizes. At the same time that physicists laid out the theories of general relativity and quantum mechanics, describing fundamental laws that governed the Universe on its smallest and largest scales, chemists discovered the laws that seemed to govern the science of nutrition.

[… which, over the last 100 years, has exploded…]

[Gyorgy Scrinis, a professor of food politics and policy at the University of Melbourne] argues that the field of nutrition science is under the sway of an ideology he dubbed ‘nutritionism’, a mode of thinking about food that makes a number of erroneous assumptions: it reduces foods to quantified collections of nutrients, pulls foods out of the context of diets and lifestyles, presumes that biomarkers such as body-mass index are accurate indicators of health, overestimates scientists’ understanding of the relationship between nutrients and health, and falls for corporations’ claims that the nutrients they sprinkle into heavily processed junk foods make them healthful. These errors lead us toward food that is processed to optimise its palatability, convenience and nutrient profile, drawing us away from the whole foods that Scrinis says we should be eating. He says the history of margarine provides a tour of the perils of nutritionism: it was first adopted as a cheaper alternative to butter, then promoted as a health food when saturated fat became a nutritional bugbear, later castigated as a nutritional villain riddled with trans fats, and recently reformulated without trans fats, using new processes such as interesterification. That has succeeded in making margarine look better, according to nutritionism’s current trends, but is another kind of ultra-processing that’s likely to diminish the quality of food….

While Scrinis cites the growing body of scientific research implicating modern food processing, he also supports his critique of nutritionism with appeals to intuition. ‘This idea that ultra-processed foods are degraded – we’ve always known this,’ he says. ‘Our senses tell us whole foods are wholesome. People know this intuitively. The best foods in terms of cuisine are made from whole foods, not McDonald’s. It’s common sense.’

Even as nutritionism pushes us to believe that the latest nutrition research reveals something important about food, we also hold on to a conflicting concept: the idea that natural foods are better for us in ways that don’t always show up in scientific studies – that whole foods contain an inherent essence that is despoiled by our harsh modern processing techniques. ‘It’s a general attitude that you can break foods down that is the problem,’ says Scrinis. ‘It’s showing no respect for the food itself.’ This idea of respecting food reveals an underlying perspective that is essentialist, which, in philosophy, is the Platonic view that certain eternal and universal characteristics are essential to identity. Science is usually thought of as the antithesis of our atavistic intuitions, yet nutrition science has contained an essentialist view of nutrition for almost a century.

Most of us carry both ideologies, essentialism and nutritionism, in our minds, pulling us in different directions, complicating how we make decisions about what to eat. This tension is also visible in official nutrition advice. Many government public health agencies give precise recommendations, based on a century of hard research, for the amounts of every nutrient we need to keep us healthy. They also insist that whole foods, especially fruits and vegetables, are the best way to get those nutrients. But if you accept the nutrient recommendations, why assume that whole foods are a better way of getting those nutrients than, say, a powdered mix that is objectively superior in terms of cost, convenience and greenhouse emissions? What’s more, powdered mixes make it far easier for people to know exactly what they’re eating, which addresses one problem that constantly vexes nutritionists.

This kind of reflexive preference for natural foods can sometimes blind us to the implications of science. Even as research piles up implicating, for instance, excessive sugar as a particular problem in modern diets, most nutrition authorities refuse to endorse artificial sweeteners as a way to decrease our sugar consumption. ‘I’ve spent a lot of time with artificial sweeteners, and I cannot find any solid evidence there’s anything wrong with including them in your diet,’ says Tamar Haspel, a Washington Post columnist who has been writing about nutrition for more than 20 years. She says there’s some evidence that low-calorie sweeteners help some people lose weight, but you won’t hear that from nutrition authorities, who consistently minimise the positives while focusing on potential downsides that have not been well-established by research, such as worries that they cause cancer or scramble the gut microbiome. Why the determined opposition? ‘Because artificial sweeteners check lots of the boxes of the things that wholesome eaters avoid. It’s a chemical that’s manufactured in a plant. It’s created by the big companies that are selling the rest of the food in our diet, a lot of which is junk.’ Haspel says that nutritionists’ attitude to low-calorie sweeteners is ‘puritanical, it’s holier-than-thou, and it’s breathtakingly condescending’. The puritanical response reflects the purity of essentialism: foods that are not ‘natural’ are not welcome in the diets of right-thinking, healthy-eating people…

Our arguments over food are so polarised because they are not only about evidence: they are about values. Our choice of what we put inside us physically represents what we want inside ourselves spiritually, and that varies so much from person to person. Hearn uses food, much of it from a blender, to hack his body and keep him well-fuelled between business meetings. Scrinis looks forward to spending time in his kitchen, tinkering with new varieties of sourdough packed with sprouted grains and seeds. Haspel lives in Cape Cod, where she grows oysters, raises chickens, and hunts deer for venison – and also drinks diet soda and uses sucralose in her smoothies and oatmeal, to help keep her weight down.

Nutritionism and essentialism provide comfortingly clear perspectives about what makes food healthful. But an open-minded look at the evidence suggests that many of the most hotly debated questions about nutrition are impossible to answer with the information we have, and maybe with any information we will have in the foreseeable future. If we isolate nutrients and eat them in different forms than they naturally come in, how will they affect us? Can processed foods be made in ways that approach or even surpass the healthfulness of natural whole foods?…

Human bodies are so fascinating in part because they are so variable and malleable. Beyond some important universals, such as the vitamins discovered a century ago, different people’s bodies work differently, because of their genes, behaviours and environments. The food we eat today changes the way our bodies work tomorrow, making yesterday’s guidance out of date. There are too many variables and too few ways to control them…

Maybe the reason that diet is so difficult to optimise is that there is no optimal diet. We are enormously flexible omnivores who can live healthily on varied diets, like our hunter-gatherer ancestors or modern people filling shopping carts at globally sourced supermarkets, yet we can also live on specialised diets, like the traditional Inuit who mostly ate a small range of Arctic animals or subsistence farmers who ate little besides a few grains they grew. Aaron Carroll, a physician in Indiana and a columnist at The New York Times, argues that people spend far too much time worrying about eating the wrong things. ‘The “dangers” from these things are so very small that, if they bring you enough happiness, that likely outweighs the downsides,’ he said in 2018. ‘So much of our food discussions are moralising and fear-inducing. Food isn’t poison, and this is pretty much the healthiest people have ever been in the history of mankind. Food isn’t killing us.’

Food is a vehicle for ideologies such as nutritionism and essentialism, for deeply held desires such as connecting with nature and engineering a better future. We argue so passionately about food because we are not just looking for health – we’re looking for meaning. Maybe, if meals help provide a sense of meaning for your life, that is the healthiest thing you can hope for.

Vitamins or whole foods? High-fat or low-fat? Sugar or sweetener?… Will we ever get a clear idea about what we should eat? “The Food Wars,” from Amos Zeeberg (@settostun)

[image above: source]

* Fran Lebowitz

###

As we scale the food pyramid, we might send birthday greetings in oyster sauce to Joyce Chen; she was born on this date in 1917.  A chef, restaurateur, author, television personality, and entrepreneur, she parlayed a successful Cambridge, MA restaurant (where she’s credited with creating the “all you can eat Chinese buffet” to perk up slow Tuesdays and Wednesdays) into a collection of restaurants, a cooking school, a series of cookbooks, and a PBS series (shot on the same set as Julia Child’s show).  She is credited with popularizing northern-style Chinese cuisine in America.  Chen was honored in 2014 (along with Julia Child) as one of the five chefs featured on a series of U.S. postage stamps.

 source

Written by (Roughly) Daily

September 14, 2021 at 1:00 am

“Now I luxuriously thrust for noble pickle”*…

The delicacy that delights…

Amerigo Vespucci didn’t discover the Americas, contrary to what the map-makers who named the continents believed, but his given name did end up lending itself to the so-called “new world.” And Ralph Waldo Emerson once called Vespucci “the pickle-dealer at Seville,” a derisive label that may have stretched the truth a bit, but pointed towards a very real part of the itinerant Italian’s biography.

Before traveling to the New World himself, Vespucci worked as a ship chandler—someone who sold supplies to seafaring merchants and explorers. These supplies included foods like meat, fish, and vegetables that had been pickled, which meant they would stay preserved beneath a ship’s deck for months. Without pickling, expeditions had to rely on dried foods and ingredients with naturally long shelf lives for sustenance. Much of the time, this limited diet wasn’t enough to provide crewmembers the nutrition they needed for the journey ahead. This made pickle sellers like Vespucci indispensable during the golden age of exploration. Vespucci even supplied Christopher Columbus’s later voyages across the Atlantic with his briny goods. So while he wasn’t the world’s most important explorer, Vespucci’s pickles may have changed history by preventing untold bouts of scurvy.

And pickles weren’t just enjoyed by 15th-century sailors. From ancient Mesopotamia to New York deli counters, they’ve played a vital role in the global culinary scene. But where do pickles come from? How did the cucumber become the standard-issue pickling vegetable in the States? And what exactly is a pickle, anyway?…

The story of a humble but crucial comestible: “A Brief History of Pickles.”

* Martial

###

As we dig in, we might spare a thought for Sylvester Graham; he died on this date in 1851. A Presbyterian minister, he preached primarily about the benefits of vegetarianism (and temperance). He urged the use only of whole, coarse grains– inspiring a host of graham flour, graham bread, and graham cracker products.

source

Written by (Roughly) Daily

September 11, 2021 at 1:00 am

“How can you govern a country which has 246 varieties of cheese”*…

Well, one strategy, embraced by dictators worldwide, is to declare one of them the official national cheese…

It always surprises me that more people don’t know that pad Thai was invented by a dictator. I don’t mean that the authoritarian prime minister of Thailand, Plaek Phibunsongkhram, got creative in the kitchen one day. But he made pad Thai—then an unknown noodle dish without a name—the country’s national dish by fiat.

Phibunsongkhram was a military officer who took power in a coup and liked to compare himself to Napoleon. Establishing pad Thai as Thailand’s official food was one of many reforms he pursued to unify the country under his leadership. And it was remarkably successful.

The Thai leader is not the only authoritarian who took an active interest in his country’s cuisine. When successful, dictators’ food obsessions can change how a country eats and drinks for generations. Here, we explore the fascinating but unnerving world of dictator food projects…

Authoritarian food obsessions can have a lasting legacy: “The Dictators Who Ruled Their Countries’ Cuisines,” from Alex Mayyasi (@amayyasi), with a Q&A with chef-turned-journalist Witold Szablowski, who published How to Feed a Dictator, a book that tells the story of five chefs who worked for five terrible rulers.

* Charles de Gaulle

###

As we contemplate comestible coercion, we might send comforting birthday greetings to Dorcas Lillian Bates Reilly; she was born on this date in 1926. A chef and inventor, she worked for many years in the test kitchen at the Campbell’s Soup Company– where she developed hundreds of recipes, including a tuna-noodle casserole and Sloppy Joe “souperburgers.” But she is best remembered for “the green bean bake”– or as it is better known, the green bean casserole— a holiday staple in tens of millions of households every year. While her recipe made good use of her employer’s Cream of Mushroom Soup, she believed that French’s crispy fried onions were the “touch of genius” in the dish.

source

Written by (Roughly) Daily

July 22, 2021 at 1:00 am

“In a world of diminishing mystery, the unknown persists”*…

But what is it for?…

In the first episode of Buck Rogers, the 1980s television series about an astronaut from the present marooned in the 25th century, our hero visits a museum of the future. A staff member brandishes a mid-20th-century hair dryer. “Early hand laser,” he opines. As an observation of how common knowledge gets lost over time, it’s both funny and poignant. Because our museums also stock items from the past that completely baffle the experts.

Few are as intriguing as the hundred or so Roman dodecahedrons that we have found… In 1739, a strange, twelve-sided hollow object from Roman times was discovered in England. Since then, more than a hundred dodecahedrons have been unearthed. We know next to nothing about these mysterious objects — so little, in fact, that the various theories about their meaning and function are themselves a source of entertainment.

So, what do we know?

Roman dodecahedrons — or, more properly, Gallo-Roman dodecahedrons — are twelve-sided hollow objects, each side pentagonal in shape and almost always pierced by a hole. The outer edges generally feature rounded protrusions.

Most of the objects are made from bronze, but some are in stone and don’t have holes or knobs. The dodecahedrons are often fist-sized yet can vary in height from 4 to 11 cm (about 1.5 to 4.5 in). The size of the holes also varies, from 6 to 40 mm (0.2 to 1.5 in). Two opposing holes typically are of differing sizes.

Objects of this type were unknown until the first one was found in 1739 in Aston, Hertfordshire. In all, 116 have been dug up from sites as far apart as northern England and Hungary. But most have been found in Gaul, particularly in the Rhine basin, in what is now Switzerland, eastern France, southern Germany, and the Low Countries. Some were found in coin hoards, indicating their owners considered them valuable. Most can be dated to the 2nd and 3rd century AD.

No mention of the dodecahedrons from Roman times has survived. Any theory as to their function is based solely on speculation. Some suggestions:

• A specific type of dice for a game since lost to history. 

• A magical object, possibly from the Celtic religion. A similar small, hollow object with protrusions was recovered from Pompeii in a box with either jewellery or items for magic.

• A toy for children.

• A weight for fishing nets.

• The head of a chieftain’s scepter. 

• A kind of musical instrument. 

• A tool to estimate distances and survey land, especially for military purposes.

• An instrument to estimate the size of and distance to objects on the battlefield for the benefit of the artillery.

• A device for detecting counterfeit coins.

• A calendar for determining the spring and autumn equinoxes and/or the optimal date for sowing wheat.

• A candle holder. (Wax residue was found in one or two of the objects recovered.)

• A connector for metal or wooden poles. 

• A knitting tool specifically for gloves. (That would explain why no dodecahedrons were found in the warmer regions of the Empire.)

• A gauge to calibrate water pipes.

• A base for eagle standards. (Each Roman legion carried a symbolic bird on a staff into battle.) 

• An astrological device used for fortune-telling. (Inscribed on a dodecahedron found in Geneva in 1982 were the Latin names for the 12 signs of the zodiac.) 

The geographic spread of the dodecahedrons we know of is striking: all were found in territories administered by Rome and inhabited by Celts. That strengthens the theory that they were specific to Gallo-Roman culture, which emerged from the contact between the Celtic peoples of Gaul and their Roman conquerors.

Intriguingly, archaeologists in the 1960s found similar objects along the Maritime Silk Road in Southeast Asia, except smaller and made of gold. They do not appear to predate the Gallo-Roman artefacts and may be evidence of Roman influence on the ancient Indochinese kingdom of Funan…

The first of many was unearthed almost three centuries ago, and we still don’t know what they were for: “Mysterious dodecahedrons of the Roman Empire.”

* Jhumpa Lahiri

###

As we ponder the puzzle, we might send compressed birthday greetings to Aaron “Bunny” Lapin; he was born on this date in 1914.  In 1948, Lapin invented Reddi-Wip, the pioneering whipped cream dessert topping dispensed from a spray can.   First sold by milkmen in St. Louis, the product rode the post-World War Two convenience craze to national success; in 1998, it was named by Time one of the century’s “100 great consumer items”– along with the pop-top can and Spam.  Lapin became known as the Whipped Cream King; but his legacy is broader:  in 1955, he patented a special valve to control the flow of Reddi-Wip from the can, and formed The Clayton Corporation to manufacture it.  Reddi-Wip is now a Con-Agra brand; but Clayton goes strong, now making industrial valves, closures, caulk, adhesives and foamed plastic products (like insulation and cushioning materials).

source

“Doing research on the Web is like using a library assembled piecemeal by pack rats and vandalized nightly”*…

But surely, argues Jonathan Zittrain, it shouldn’t be that way…

Sixty years ago the futurist Arthur C. Clarke observed that any sufficiently advanced technology is indistinguishable from magic. The internet—how we both communicate with one another and together preserve the intellectual products of human civilization—fits Clarke’s observation well. In Steve Jobs’s words, “it just works,” as readily as clicking, tapping, or speaking. And every bit as much aligned with the vicissitudes of magic, when the internet doesn’t work, the reasons are typically so arcane that explanations for it are about as useful as trying to pick apart a failed spell.

Underpinning our vast and simple-seeming digital networks are technologies that are artifacts of a very particular circumstance; had they not already been invented, it’s unlikely they would be designed the same way in an alternate timeline.

The internet’s distinct architecture arose from a distinct constraint and a distinct freedom: First, its academically minded designers didn’t have or expect to raise massive amounts of capital to build the network; and second, they didn’t want or expect to make money from their invention.

The internet’s framers thus had no money to simply roll out a uniform centralized network the way that, for example, FedEx metabolized a capital outlay of tens of millions of dollars to deploy liveried planes, trucks, people, and drop-off boxes, creating a single point-to-point delivery system. Instead, they settled on the equivalent of rules for how to bolt existing networks together.

Rather than a single centralized network modeled after the legacy telephone system, operated by a government or a few massive utilities, the internet was designed to allow any device anywhere to interoperate with any other device, allowing any provider able to bring whatever networking capacity it had to the growing party. And because the network’s creators did not mean to monetize, much less monopolize, any of it, the key was for desirable content to be provided naturally by the network’s users, some of whom would act as content producers or hosts, setting up watering holes for others to frequent.

Unlike the briefly ascendant proprietary networks such as CompuServe, AOL, and Prodigy, content and network would be separated. Indeed, the internet had and has no main menu, no CEO, no public stock offering, no formal organization at all. There are only engineers who meet every so often to refine its suggested communications protocols that hardware and software makers, and network builders, are then free to take up as they please.

So the internet was a recipe for mortar, with an invitation for anyone, and everyone, to bring their own bricks. Tim Berners-Lee took up the invite and invented the protocols for the World Wide Web, an application to run on the internet. If your computer spoke “web” by running a browser, then it could speak with servers that also spoke web, naturally enough known as websites. Pages on sites could contain links to all sorts of things that would, by definition, be but a click away, and might in practice be found at servers anywhere else in the world, hosted by people or organizations not only not affiliated with the linking webpage, but entirely unaware of its existence. And webpages themselves might be assembled from multiple sources before they displayed as a single unit, facilitating the rise of ad networks that could be called on by websites to insert surveillance beacons and ads on the fly, as pages were pulled together at the moment someone sought to view them.

And like the internet’s own designers, Berners-Lee gave away his protocols to the world for free—enabling a design that omitted any form of centralized management or control, since there was no usage to track by a World Wide Web, Inc., for the purposes of billing. The web, like the internet, is a collective hallucination, a set of independent efforts united by common technological protocols to appear as a seamless, magical whole.

This absence of central control, or even easy central monitoring, has long been celebrated as an instrument of grassroots democracy and freedom. It’s not trivial to censor a network as organic and decentralized as the internet. But more recently, these features have been understood to facilitate vectors for individual harassment and societal destabilization, with no easy gating points through which to remove or label malicious work not under the umbrellas of the major social-media platforms, or to quickly identify their sources. While both assessments have power to them, they each gloss over a key feature of the distributed web and internet: Their designs naturally create gaps of responsibility for maintaining valuable content that others rely on. Links work seamlessly until they don’t. And as tangible counterparts to online work fade, these gaps represent actual holes in humanity’s knowledge…
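Link rot is easy to observe firsthand. Below is a minimal sketch – your correspondent’s illustration, not Zittrain’s method – in plain Python (standard library only) that fetches a page, gathers its outbound links, and reports which ones still resolve. The starting URL is a placeholder; a serious survey would add rate-limiting, robots.txt checks, and a finer taxonomy of failures.

```python
# Minimal link-rot probe: fetch one page, extract its <a href> links,
# and check whether each still resolves. Standard library only.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag encountered on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def probe(url, timeout=10):
    """Return the HTTP status code, or a short label if the link is dead."""
    try:
        req = Request(url, headers={"User-Agent": "link-rot-probe/0.1"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code            # server answered; e.g. 404 means the page is gone
    except (URLError, ValueError, TimeoutError):
        return "unreachable"     # dead domain, refused connection, malformed URL

if __name__ == "__main__":
    start = "https://example.com/"        # placeholder: point at any page to audit
    with urlopen(start, timeout=10) as resp:
        page = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    for link in collector.links[:20]:     # small sample, out of politeness
        absolute = urljoin(start, link)   # resolve relative hrefs against the page
        if absolute.startswith("http"):
            print(probe(absolute), absolute)
```

Run against a blog post from a couple of decades ago, a probe like this will typically turn up a scattering of 404s and unreachable domains – the “actual holes” in humanity’s knowledge that the piece describes.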

The glue that holds humanity’s knowledge together is coming undone: “The Internet Is Rotting.” @zittrain explains what we can do to heal it.

(Your correspondent seconds his call to support the critically important work of The Internet Archive and the Harvard Library Innovation Lab, along with the other initiatives he outlines.)

* Roger Ebert

###

As we protect our past for the future, we might recall that it was on this date in 1937 that Hormel introduced Spam. It was the company’s attempt to increase sales of pork shoulder, not at the time a very popular cut. While there are numerous speculations as to the “meaning of the name” (from a contraction of “spiced ham” to “Scientifically Processed Animal Matter”), its true genesis is known to only a small circle of former Hormel Foods executives.

As a result of the difficulty of delivering fresh meat to the front during World War II, Spam became a ubiquitous part of the U.S. soldier’s diet. It was variously referred to as “ham that didn’t pass its physical,” “meatloaf without basic training,” and “Special Army Meat.” Over 150 million pounds of Spam were purchased by the military before the war’s end. During the war and the occupations that followed, Spam was introduced into Guam, Hawaii, Okinawa, the Philippines, and other islands in the Pacific. Immediately absorbed into native diets, it has become a unique part of the history and effects of U.S. influence in the Pacific islands.

source
