(Roughly) Daily

Posts Tagged ‘Alexander Graham Bell’

“A commodity appears at first sight an extremely obvious, trivial thing. But its analysis brings out that it is a very strange thing”*…

Line graph depicting price changes of selected US consumer goods and services from January 2000 to June 2022, highlighting categories becoming more expensive and those becoming more affordable.

Prices are on everyone’s minds these days. Brian Potter looks underneath the costs of the finished products and services that we typically track to examine the costs of the commodities that go into them…

This American Enterprise Institute chart [above], which breaks down price changes for different types of goods and services in the consumer price index, has by now become very widely known. A high-level takeaway from this chart is that labor-intensive services (education, healthcare) get more expensive in inflation-adjusted terms over time, while manufactured goods (TVs, toys, clothing) get less expensive over time.

But there are many types of goods that aren’t shown on this chart. One example is commodities: raw (or near-raw) materials mined or harvested from the earth. Commodities have many similarities with manufactured goods: they’re physical things that are produced (or extracted) using some sort of production technology (mining equipment, oil drilling equipment), and many of them will go through factory-like processing steps (oil refineries, blast furnaces). But commodities also seem distinct from manufactured goods. For one, because they’re often extracted from the earth, commodities can be subject to depletion dynamics: you run out of them at one location, and have to go find more somewhere else. In my book I talk about how iron ore used to be mined from places like Minnesota, but as the best deposits were mined out steel companies increasingly had to source their ore from overseas. And the idea of “Peak Oil” is based on the idea that society will use up the easily accessible oil, and be forced to obtain it from increasingly marginal, expensive-to-access locations.

(Some commodities, particularly agricultural commodities that can be repeatedly grown on a plot of land, don’t have the same sort of depletion dynamics, though bad farming practices can degrade a plot of land over time. Other commodities get naturally replenished over time, but can still get used up if the rate of extraction exceeds the rate of replenishment; non-farmed timber harvesting and non-farmed commercial fishing come to mind as examples.)

Going into this topic, I didn’t have a great sense of what price trends look like for commodities in general. Julian Simon famously won a 1980 bet with Paul Ehrlich that several raw materials — copper, chromium, nickel, tin, and tungsten — would be cheaper (in inflation-adjusted terms) after 10 years, not more expensive. But folks have pointed out that if the bet had been over a different 10-year window, Ehrlich would have won the bet.

To better understand how price tends to change for different commodities and raw materials, I looked at historical prices for over a hundred different commodities. Broadly, agricultural commodities tend to get cheaper over time, while fossil fuels have a slight tendency to get more expensive. Minerals (chemicals, metals, etc.) have a slight tendency towards getting cheaper, with a lot of variation — 15 minerals more than doubled in price over their respective time series. But this has shifted over the last few decades, and recently there’s been a greater tendency for commodities to rise in price…

[Potter offers a thorough– and fascinating– analysis, concluding…]

… historically commodities have generally fallen in price over time, but recently this trend has increasingly shifted towards rising prices. Natural gas and oil got cheaper until the 1950s and the 1970s, respectively, and since then have gotten more expensive. Beef and pork both got cheaper from 1970 until the 1990s, and since then have risen in price. Agricultural products were almost uniformly falling in price until around 2000, and have almost uniformly risen in price since then.

My general sense looking at historical commodity price data is that the more that production of some commodity looks like manufacturing — produced by a repetitive process that can be steadily improved and automated, from a supply that can be scaled up in a relatively straightforward fashion, without being subject to severe depletion dynamics — the more you’ll tend to see prices fall over time. The biggest decline in price of any commodity I looked at is industrial diamonds, which fell in price by 99.9% between 1900 and 2021 due to advances in lab-grown diamond production. This effectively replaced mined diamonds with manufactured ones for industrial uses; roughly 99% of industrial diamonds today are synthetic. Many other commodities had major price declines that were the result of production process improvements — aluminum got cheaper thanks to the invention (and subsequent improvements) of the Hall-Héroult smelting process, titanium’s price declined following the introduction of the Kroll process, and so on. (Steel also got much cheaper following the introduction of the Bessemer process, but that predates USGS price data.) And of course agriculture, which has evolved from crops being harvested manually to being harvested with highly automated, continuous process machinery, closely mirrors the sorts of process improvements we see in manufacturing.

Of course, this trend alone can’t explain changes in commodity prices over time, and there are plenty of commodities — steel, cement, silicon — that are produced in a manufacturing-type operation but which haven’t seen substantially declining prices over their history. And even commodities which resemble manufactured goods have risen in price recently. More generally, there are plenty of things that can shift supply and demand curves to the right or left: cartels, national policies, a spike or collapse in demand, and so on. But the question of “how much, over time, does the production of this commodity resemble a manufacturing process?” seems like a useful lens on understanding the dynamics of commodity prices…

“Do Commodities Get Cheaper Over Time?” from @constructionphysics.skystack.xyz.

* Karl Marx

###

As we brush up on the basics, we might recall that this date is the anniversary of two events that spurred commodity consumption.

Alexander Graham Bell spurred a boom in copper consumption when, on this date in 1915, he placed the first transcontinental phone call, from New York to San Francisco, where the Panama–Pacific International Exposition celebrations were underway and his assistant Thomas Augustus Watson stood by. Bell repeated his famous first telephonic words, “Mr. Watson, come here. I want you,” to which Watson this time replied “It will take me five days to get there now!” Bell’s call officially initiated AT&T’s transcontinental service.

A sepia-toned historical photograph of a panel of seven men seated at a long table, dressed in formal attire, with a dignified backdrop featuring dark curtains and a portrait hanging above. The setting appears to be a formal meeting or assembly, likely from the early 20th century.
Alexander Graham Bell, about to call San Francisco from New York. (source)

And, on this date 45 years later, in 1959, the aluminum market got a boost when the first non-stop transcontinental commercial jet trip was made by an American Airlines Boeing 707, from Los Angeles to New York. The sleek silver plane made the flight in an official airline time of 4 hours and 3 minutes, half the usual scheduled time for the prop-driven DC-7Cs then in regular use on that route.

A vintage jet airliner flying above the clouds, featuring a silver and orange color scheme.

source

Written by (Roughly) Daily

January 25, 2026 at 1:00 am

“‘When I use a word,’ Humpty Dumpty said in rather a scornful tone, ‘it means just what I choose it to mean — neither more nor less.'”*…

A Renaissance portrait of Erasmus of Rotterdam, depicting him in profile while writing in a book with a quill pen, set against a dark green background.
Portrait of Erasmus of Rotterdam Writing (1523) by Hans Holbein

Like today’s large language models, some 16th-century humanists (like Erasmus) had techniques to automate writing. But as Hannah Katznelson explains, others (like Rabelais) called foul…

The Renaissance scholar and educator Erasmus of Rotterdam opens his polemical treatise The Ciceronian (1528) by describing the utterly dysfunctional writing process of a character named Nosoponus. The Ciceronian is structured as a dialogue, with two mature writers, Bulephorus and Hypologus, trying to talk Nosoponus out of his paralysing obsession with stylistic perfection. Nosoponus explains that it would take him weeks of fruitless writing and rewriting to produce a casual letter in which he asks a friend to return some borrowed books. He says that writing requires such intense concentration that he can do it only at night, when no one else is awake to distract him, and even then his perfectionism is so intense that a single sentence becomes a full night’s work. Nosoponus goes over what he’s written again and again, but remains so dissatisfied with the quality of his language that eventually he just gives up.

Nosoponus’s problem might resonate. Who has not spent too long going over the wording of a simple email, at some point or another? Today there is an easy fix: we have large language models (LLMs) to write our letters for us, helpfully proffering suggestions as to what we might say, and how we might phrase it. When I input Nosoponus’s intended request into GPT-4, it generated the following almost instantly:

Hey [Friend’s Name],

Hope you’re doing well! I just realised I never got those books back that I lent you a while ago. No rush, but whenever you get a chance, I’d love to get them back. Let me know what works for you! Thanks!

Nosoponus

But there was a solution in the 16th century, too. A humanist education on the Erasmian model could train its students to produce letters of any length, on any topic – quickly, easily and eloquently. The French humanist François Rabelais, a contemporary of Erasmus, appears to have understood these compositional techniques as automating the creation of text in a way that, retrospectively, looks a lot like how LLMs function. If we want to understand LLMs, and what they are and aren’t capable of, we can look at earlier versions of the same technology – like Erasmian humanism. We can also read authors like Rabelais, who is already thinking about automatic text-generation along these lines, as someone who appreciates the effectiveness of Erasmian generative technology, but at the same time sees it as vitiating the social force of language and, ultimately, ruining language as a tool for moral and political life…

[Katznelson recounts Erasmus’s efforts, Rabelais’s response, and unpacks the important differences between our own authentic speech and language created to speak for us, and their practical and moral implications…]

What the 16th century can teach us about AI and LLMs: “Methodical banality,” from @aeon.co.

* Lewis Carroll, Through the Looking Glass

###

As we honor authenticity, we might recall that it was on this date in 1886 that three U.S. patents were issued to Alexander Graham Bell’s Volta Labs for “recording and reproducing speech and other sounds.” The Graphophone was an improved (and the first practical) version of the Edison phonograph (from 1877), and became the foundation on which the speech recording (e.g., dictaphone) and recorded music (and spoken word) industries began to grow.

An illustration of an early speech recording device, the Graphophone, showcasing its intricate components and design.

source

“Badger hates Society, and invitations, and dinner, and all that sort of thing”*…

Americans are now spending more time alone– mostly at home– than ever. It’s changing our personalities, our politics, and even our relationship to reality. While the pandemic certainly enforced some of that isolation, the post-COVID world remains extraordinarily atomized.

In a bracing essay, Derek Thompson explores the emergence of this widespread isolation, unpacking its drivers, enumerating its considerable (personal and civic) costs, musing on the possible impact of AI, and pondering what might lead to a return to sociability…

… “I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam [author of Bowling Alone] told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school…

On how we spend our time and what that yields: “The Anti-Social Century,” from @dkthomp.bsky.social in @theatlantic.com (gift article).

See also: “You’re Being Alienated From Your Own Attention,” from @chrislhayes.bsky.social (also in @theatlantic.com, also a gift article)

* Kenneth Grahame, The Wind in the Willows

###

As we call a friend, we might recall that it was on this date in 1915 that Alexander Graham Bell placed the first transcontinental phone call, from New York to San Francisco, where the Panama–Pacific International Exposition celebrations were underway and his assistant Thomas Augustus Watson stood by. Bell repeated his famous first telephonic words, “Mr. Watson, come here. I want you,” to which Watson this time replied “It will take me five days to get there now!” Bell’s call officially initiated AT&T’s transcontinental service.

Alexander Graham Bell, about to call San Francisco from New York. (source)

And, on this date 45 years later, in 1959, the first non-stop transcontinental commercial jet trip was made by an American Airlines Boeing 707, from Los Angeles to New York. The sleek silver plane made the flight in an official airline time of 4 hours and 3 minutes, half the usual scheduled time for the prop-driven DC-7Cs then in regular use on that route.

source

“You are the music while the music lasts”*…

Composer (and Stanford professor) Jonathan Berger explains how music works its magic on our brains…

One evening, some 40 years ago, I got lost in time. I was at a performance of Schubert’s String Quintet in C major. During the second movement I had the unnerving feeling that time was literally grinding to a halt. The sensation was powerful, visceral, overwhelming. It was a life-changing moment, or, as it felt at the time, a life-changing eon.

It has been my goal ever since to compose music that usurps the perceived flow of time and commandeers the sense of how time passes. Although I’ve learned to manipulate subjective time, I still stand in awe of Schubert’s unparalleled power. Nearly two centuries ago, the composer anticipated the neurological underpinnings of time perception that science has underscored in the past few decades.

The human brain, we have learned, adjusts and recalibrates temporal perception. Our ability to encode and decode sequential information, to integrate and segregate simultaneous signals, is fundamental to human survival. It allows us to find our place in, and navigate, our physical world. But music also demonstrates that time perception is inherently subjective—and an integral part of our lives. “For the time element in music is single,” wrote Thomas Mann in his novel, The Magic Mountain. “Into a section of mortal time music pours itself, thereby inexpressibly enhancing and ennobling what it fills.”

We conceive of time as a continuum, but we perceive it in discretized units—or, rather, as discretized units. It has long been held that, just as objective time is dictated by clocks, subjective time (barring external influences) aligns to physiological metronomes. Music creates discrete temporal units but ones that do not typically align with the discrete temporal units in which we measure time. Rather, music embodies (or, rather, is embodied within) a separate, quasi-independent concept of time, able to distort or negate “clock-time.” This other time creates a parallel temporal world in which we are prone to lose ourselves, or at least to lose all semblance of objective time.

In recent years, numerous studies have shown how music hijacks our relationship with everyday time…

The fascinating story of “How Music Hijacks Our Perception of Time,” in @NautilusMag.

* T. S. Eliot

###

As we tangle with tempo, we might spare a thought for Charles Sumner Tainter; he died on this date in 1940. A scientific instrument maker, engineer, and inventor, he is best known for his collaborations with Alexander Graham Bell, and for his significant improvements to Thomas Edison’s phonograph, resulting in the Graphophone— which, beyond bringing music to living rooms around the world by making Edison’s idea commercially feasible, also spawned the Dictaphone.

source

Written by (Roughly) Daily

April 20, 2023 at 1:00 am

“Progress means getting nearer to the place you want to be. And if you have taken a wrong turn, then to go forward does not get you any nearer.”*…

Earlier (Roughly) Daily posts have looked at “Progress Studies” and at its relationship to the Rationalism community. Garrison Lovely takes a deeper look at this growing and influential intellectual movement that aims to understand why human progress happens – and how to speed it up…

For most of history, the world improved at a sluggish pace, if at all. Civilisations rose and fell. Fortunes were amassed and squandered. Almost every person in the world lived in what we would now call extreme poverty. For thousands of years, global wealth – at least our best approximations of it – barely budged.

But beginning around 150-200 years ago, everything changed. The world economy suddenly began to grow exponentially. Global life expectancy climbed from less than 30 years to more than 70 years. Literacy, extreme poverty, infant mortality, and even height improved in a similarly dramatic fashion. The story may not be universally positive, nor have the benefits been equally distributed, but by many measures, economic growth and advances in science and technology have changed the way of life for billions of people.

What explains this sudden explosion in relative wealth and technological power? What happens if it slows down, or stagnates? And if so, can we do something about it? These are key questions of “progress studies”, a nascent self-styled academic field and intellectual movement, which aims to dissect the causes of human progress in order to better advance it.

Founded by an influential economist and a billionaire entrepreneur, this community tends to define progress in terms of scientific or technological advancement, and economic growth – and therefore their ideas and beliefs are not without their critics. So, what does the progress studies movement believe, and what do they want to see happen in the future?

Find out at: “Do we need a better understanding of ‘progress’?,” from @GarrisonLovely at @BBC_Future.

Then judge for yourself: was Adorno right? “It would be advisable to think of progress in the crudest, most basic terms: that no one should go hungry anymore, that there should be no more torture, no more Auschwitz. Only then will the idea of progress be free from lies.” Or can– should– we be more purposively, systemically ambitious?

* C. S. Lewis

###

As we get better at getting better, we might recall that it was on this date in 1922 that the United States paid tribute to a man instrumental in the progress that Progress Studies is anxious to sustain, Alexander Graham Bell…

There were more than 14 million telephones in the United States by the time Alexander Graham Bell died. For one minute on August 4, 1922, they were all silent.

The reason: Bell’s funeral. The American inventor was the first to patent telephone technology in the United States and founded the Bell Telephone System in 1877. Though Bell wasn’t the only person to invent “the transmission of speech by electrical wires,” writes Randy Alfred for Wired, achieving patent primacy in the United States allowed him to spend his life inventing. Even though the telephone changed the world, Bell didn’t stop there.

Bell died on August 2, 1922, just a few days after his 75th birthday. “As a mark of respect every telephone exchange in the United States and Canada closed for a minute when his funeral began around 6:30 p.m. Eastern Standard Time,” Alfred writes.

On the day of the funeral, The New York Times reported that Bell was also honored by advocates for deaf people. “Entirely apart from the monumental achievement of Professor Bell as the inventor of the telephone, his conspicuous work in [sic] behalf of the deaf of this country would alone entitle him to everlasting fame,” said Felix H. Levey, president of the Institution for the Improved Instruction of Deaf Mutes.

In fact, Bell spent much of his income from the telephone on helping deaf people. In 1880, the same year he was awarded France’s Volta Prize, Bell founded the Volta Laboratory. The laboratory, originally called Volta Associates, capitalized on Bell’s work and the work of other sound pioneers. It made money by patenting new innovations for the gramophone and other recorded sound technologies. In 1887, Bell took his share of the money from the sale of gramophone patents and founded the Volta Bureau “as an instrument for the increase and diffusion of knowledge relating to the Deaf,” writes the National Park Service. Bell and Volta continued to work for deaf rights throughout his life.

Volta Laboratory eventually became Bell Laboratories, which was home to many of the twentieth century’s communication innovations.

Smithsonian

source