(Roughly) Daily


“Like so many named places… it was less an identifiable city than a grouping of concepts– census tracts, special purpose bond-issue districts, shopping nuclei, all overlaid with access roads to its own freeway”*…

Dallas-Fort Worth has one of the world’s most extensive urban freeway systems. It is the product of the pro-growth ambition of political and business leaders, and has empowered the ambition of real estate developers, big business, the technology industry and entrepreneurs. The North Texas cultural spirit to think big and build big has guided the ongoing growth and expansion of Dallas-Fort Worth freeways, a transportation system which has propelled North Texas to be among the most economically successful regions in the United States in the post-World War II era. Dallas-Fort Worth Freeways documents the origins, politics, influence and resulting urban landscape of North Texas freeways…

The very complete– and lavishly illustrated– history of the Dallas-Fort Worth area’s motorways: “Dallas-Fort Worth Freeways.”

See also the same author’s equally remarkable “Houston Freeways.”

* Thomas Pynchon, The Crying of Lot 49

###

As we watch for our exit, we might send motile birthday greetings to Nicolas-Joseph Cugnot; he was born on this date in 1725.  In 1769, Cugnot, a military engineer, invented the world’s first fuel-propelled vehicle–a gun tractor commissioned by the French government.  The following year he produced the first mechanically-driven “horseless carriage”: his steam tricycle carried four passengers and was the forerunner of the modern motor car.

There are reports of a minor incident in 1771, when the second prototype vehicle is said to have accidentally knocked down a brick or stone wall– either a Paris garden wall or part of the Paris Arsenal walls– in perhaps the first known automobile accident.

Nicolas-Joseph Cugnot, monument at Void (Lorraine)

source

Written by LW

September 25, 2020 at 1:01 am

“The greatest value of a picture is when it forces us to notice what we never expected to see”*…

Detail from Richard Waller’s “Tabula colorum physiologica …” [Table of physiological colours], from Philosophical Transactions, 1686 — Source.

One of the most demanding challenges for early modern scientists was devising how best to visually portray their discoveries to the public. In the absence of any sort of technology for automatic visualisation, like cameras or scanners, the sixteenth- and seventeenth-century natural philosopher had to rely on drawings and subsequently woodcuts, etchings, or engravings to turn an experimental finding into a reproducible and publicly accessible demonstration. This was a laborious, expensive, time-consuming, and often problematic operation. Negotiated between several parties involved in the world of image-making, such as draughtsmen, engravers, and printers, the results were inevitably compromises between the intentions of the researcher and the possibilities of the printing press. For example, what a drawing could express with shading, washing, and chromatic nuances, printed illustrations could only approximate through a binary system of black and white, resulting from the pressure of an inked copper plate against a page.

The problem of efficient imaging was particularly felt during the early years of the Royal Society, a scientific institution founded in London in the early 1660s and today still regarded as one of the most prestigious institutions of scientific research in the world. In its early decades of activity, the Royal Society established itself as one of the central forces of the Scientific Revolution, with renowned members such as Robert Boyle and Isaac Newton. Members of the Society used to meet on a weekly basis to discuss ongoing research on a variety of subjects, such as physics, mathematics, biology, astronomy, mechanics, geography, and antiquarianism.

Soon after its foundation, the Royal Society sought new ways to increase visibility and maximise its public reach. From this emerged the Philosophical Transactions, a monthly peer-reviewed journal, the first of its kind, featuring extracts from the Royal Society’s weekly research meetings. Founded in 1665 by the Society’s Secretary Henry Oldenburg and still published to this day, the Transactions are regarded as the first and longest-running scientific journal in history, as contributions were the result of original explorative studies into natural and mechanical matters informed by the Society’s culture of experiment — part of what today we generally call science.

The Transactions were printed in small quarto format (about 17x22cm) with up to about a dozen articles per issue and could be purchased for the price of one shilling, about £5 today. The journal was a pioneering learned publication, with exceptional frequency and aimed at a diverse public of curious researchers. As such, especially in the early years, its contributors were often preoccupied with how best to communicate their ideas and discoveries through the immediacy of mass-producible visual media. A closer look into a selection of these articles demonstrates the extent to which natural philosophers were prepared to re-invent the production and consumption of images with new and often odd strategies for representing the world. This was a process of endless hands-on experimentation, often pushing beyond the traditional confines of the printing house…

From infographics to digital renders, today’s scientists have ready access to a wide array of techniques to help visually communicate their research. It wasn’t always so: “‘More Lively Counterfaits’– Experimental Imaging at the Birth of Modern Science.”

* John Tukey

###

As we “show don’t tell,” we might spare a thought for Earle Dickson; he died on this date in 1961.  Dickson, concerned that his wife, Josephine Knight, often cut herself while doing housework and cooking, devised a way that she could easily apply her own dressings.  He prepared ready-made bandages by placing squares of cotton gauze at intervals along an adhesive strip and covering them with crinoline.  In the event, all his wife had to do was cut off a length of the strip and wrap it over her cut.  Dickson, who worked as a cotton buyer at Johnson & Johnson, took his idea to his employer… and the Band-Aid was born.

 source

“Corn is a greedy crop, as farmers will tell you”*…

With harvest season nigh, the corn should be “as high as an elephant’s eye,” as the old Rodgers and Hammerstein song has it. Time to get that water boiling for the corn on the cob?

Sure. But most of the corn we consume isn’t on the cob, in a can, or frozen. Sweet corn is actually less than 1 percent of the corn grown in the United States. Popcorn, our third standard type of corn, is also less than 1 percent of the American corn crop.

Educators Renee Clary and James Wandersee have explored the history of corn, from the first domestication of maize about 10,000 years ago to today’s ubiquitous “commodity corn.”

According to Clary and Wandersee, we should be flush with the hundreds of varieties of corn crafted by human selection over the centuries. But the United States, the world’s largest corn producer, almost exclusively grows field corn, which is also known as dent corn, or, in the futures markets, “commodity corn.” It is not delicious with butter and salt.

Commodity corn is a bit magical, however, because it can be transformed into a plethora of products. Food is still a big part of the corn equation, but indirectly. Meat, for example, is actually transmuted corn: Four-tenths of the U.S. corn crop goes to feed chickens, pigs, and cattle. And since cattle evolved as grass eaters, they have to be dosed with antibiotics because corn isn’t healthy for them.

Many processed foods are built on the back of corn. High-fructose corn syrup, much cheaper than sugar, is the most obvious of these corn-based ingredients in our food system…

Clary and Wandersee have their students survey what’s in their home pantries. There’s a lot of corn hidden in food labels. Caramel color, lecithin, citric acid, modified and unmodified starch? Corn. The same with ascorbic acid, lysine, dextrose, monoglycerides, diglycerides, maltose, maltodextrin, and MSG. Xanthan gum? Well, there’s no such thing as a xanthan gum tree.

But the corn story is even more complicated. Corn can be chemically manipulated into all sorts of unexpected uses. There’s ethanol, for instance, which is basically alcohol used as a fuel supplement.

You’ll also find corn used in the production of antibiotics; aspirin; books; charcoal briquettes; cosmetics; crayons; disposable diapers; drywall; dyes and inks; fireworks; glues; paper, and plastics. The spray cleaner Windex has at least five corn-derived components. Spark plugs, toothpaste, batteries, and running shoes can all be made with things that started out as corn, in a field, under the sun. In 2001, Goodyear introduced tires made with a starch-based filler made from corn. DuPont naturally has a corn-based synthetic fiber.

While we’re at it, Stephen King’s 1977 story “Children of the Corn” gave rise to a movie franchise starting in 1984, and there’s a new Children of the Corn movie scheduled for release next year. Corny as horror movies can be, we apparently can’t get enough of them—or of corn itself.

… all of which goes to show why, as Michael Pollan says, “farmers facing lower prices have only one option if they want to be able to maintain their standard of living, pay their bills, and service their debt, and that is to produce more [corn].” Indeed, corn is the most widely grown grain crop throughout the Americas; 13 billion bushels of corn are grown in the United States alone each year (on 91.7 million acres, the largest crop in the U.S.).

The history of corn, from the domestication of maize 10,000 years ago to today’s ubiquitous “commodity corn,” used to teach about biodiversity… and its lack: “Corn is everywhere!”

* Michael Pollan

###

As we give a shuck, we might send artificially-sweetened birthday greetings to Oliver Evans; he was born on this date in 1755. An engineer and one of the most prolific and influential inventors in the early years of the United States, he was a pioneer in the fields of automation, materials handling, and steam power. He created the first continuous production line (the first fully automated industrial process), the first high-pressure steam engine, and the first (albeit crude) amphibious vehicle and American automobile.

But given the subject of today’s post, we might note that he also created the automatic corn mill.

source

Written by LW

September 13, 2020 at 1:01 am

“Imperial is lit, but Metric is liter”*…

 

barleycorn

 

The English-speaking world has been famously (and, many argue, problematically) slow to switch to the metric system of measurement.  One of the reasons is the way in which traditional “imperial” measures are baked into our understanding of products and services we use every day.

Consider the barleycorn, which is still used as the basis of shoe sizes in English-speaking countries.

In ancient Rome, the inch (which was one twelfth of a foot) measured the width across the (interphalangeal) joint of the thumb. By the 7th century in England, the barleycorn had become a standard measure, with three grains of barley, laid end to end, equalling one inch. It took until the fourteenth century before the inch was officially sanctioned. Under pressure, Edward II (r. 1307-27) eventually succumbed to appeals from scholars and tradesmen to issue a decree to standardise measurement (Ledger, 1985).

Henceforth an English inch was the distance measured across three barleycorns; thirty-six (36) barleycorns laid end to end made a foot, and 108 a yard. Whilst the barleycorn decree of Edward II had nothing to do with shoe sizes per se, many shoemakers began to use shoe sticks. Tradesmen had traditionally used the hand-span method of measurement, which favoured the quarter-of-an-inch unit, but after the introduction of the barleycorn measure, many began to adopt the third-of-an-inch unit. With 39 barleycorns approximating the length of a normal foot, this was graded Size 13 and became the largest shoe size; other sizes were graded down by 1/3 of an inch, or one barleycorn…  [source]
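For the curious, here is a minimal sketch of the grading scheme described above– one barleycorn (1/3 inch) per size, with a 39-barleycorn (13-inch) foot graded Size 13. The function name and the rounding rule are illustrative assumptions, not part of any historical or modern sizing standard:

```python
# A sketch of the barleycorn shoe-sizing scheme described above.
# Assumptions: 1 barleycorn = 1/3 inch; 39 barleycorns grades as Size 13;
# each smaller size is one barleycorn shorter.

BARLEYCORNS_PER_INCH = 3

def barleycorn_shoe_size(foot_length_inches: float) -> int:
    """Grade a foot length (in inches) on the barleycorn scale."""
    barleycorns = round(foot_length_inches * BARLEYCORNS_PER_INCH)
    return barleycorns - 26  # 39 barleycorns -> Size 13

if __name__ == "__main__":
    for inches in (10, 11, 12, 13):
        print(f'{inches}" foot -> size {barleycorn_shoe_size(inches)}')
```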

The barleycorn is but one of the old English measures that, more or less obviously, still shape our encounters with and experience of the world:

graph of English length units

Forgotten, but not gone: the barleycorn.

* bad joke

###

As we muse on measurement, we might send inventive birthday greetings to Charles Franklin Kettering; he was born on this date in 1876.  An engineer, businessman, and inventor (the holder of 186 patents), he worked at National Cash Register (where he created the first electric cash register with an electric motor that opened the drawer), co-founded DELCO, and was head of research at General Motors from 1920 to 1947.  He invented the key-operated self-starting motor and developed several new engine types, quick-drying lacquer finishes, anti-knock fuels, and variable-speed transmissions.  In association with the DuPont Chemical Company, he was also responsible for the invention of Freon refrigerant for refrigeration and air conditioning systems.  While working with the Dayton-Wright Company he developed the “Bug” aerial torpedo, considered the world’s first aerial missile.  In 1927, he founded the Kettering Foundation, a non-partisan research foundation devoted to answering the question: “what does it take for democracy to work as it should?”

Charles Kettering on the cover of Time

source

 

“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…

 


 

In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead

###

As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  in 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped them enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses

source

 

 
