Posts Tagged ‘progress’
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…
It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…
2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the purported room-temperature superconductor LK-99, which seems increasingly likely to be a dud.
Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:
• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.
• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.
• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.
• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.
• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.
And skipping ahead to just the past two weeks:
• nuclear fusion ignition with net energy gain was achieved for the second time
• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans
• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).
Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”
The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…
Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:
- People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
- An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.
[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle and Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]
As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.
In the meantime, I’m pretty psyched about the cancer drugs…
As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”
On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.
* Max Planck
###
As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he created the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… with implications that we’re still discovering– and that Burroughs surely never saw.
Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.
“There is often a decades-long time lag between the development of powerful new technologies and their widespread deployment”*…
Jerry Neumann explores the relevance of Carlota Perez’s thinking (her concept of Techno-Economic Paradigm Shifts and theory of great surges, which built on Schumpeter’s work on Kondratieff waves) to the socio-economic moment in which we find ourselves…
I’ve been in the technology business for more than thirty years and for most of that time it’s felt like constant change. Is this the way innovation progresses, a never-ending stream of new things?
If you look at the history of technological innovation over the course of decades or centuries, not just years, it looks completely different. It looks like innovation comes in waves: great surges of technological development followed by quieter periods of adaptation.
The past 240 years have seen four of these great surges and the first half of a fifth…
Economist Carlota Perez in her 2002 book Technological Revolutions and Financial Capital puts forward a theory that addresses the causes of these successive cycles and tries to explain why each cycle has a similar trajectory of growth and crisis. Her answers lie not just in technological change, but in the social, institutional, and financial aspects of our society itself…
Perez’s theory divides each cycle into two main parts: the installation period and the deployment period. Installation runs from irruption to the crisis, and deployment follows the crisis. These are the yin and the yang of the cycle. Some of the differences between the two periods we’ve already mentioned—creative destruction vs. creative construction, financial capital vs. production capital, the battle of the new paradigm with the old vs. acceptance of the new TEP, etc…
We like theory because it tells us why, but more than that, a good theory is predictive. If Perez’s theory is correct, it should allow us to predict what will happen next in the current technological cycle…
A crisp distillation of Perez’s thinking and a provocative consideration of its possible meaning for our times: “The Age of Deployment,” from @ganeumann.
* Carlota Perez (@CarlotaPrzPerez)
###
As we ride the waves, we might recall that it was on this date in 1901, 11 years after the suicide of Vincent Van Gogh (and as his vision and its impact flowered in its “Deployment Age”), that a large retrospective of his work (71 paintings) was held at the Bernheim-Jeune Gallery. It captured the excitement of André Derain and Maurice de Vlaminck— and thus contributed to the emergence of Fauvism.

“On the one hand the computer makes it possible in principle to live in a world of plenty for everyone, on the other hand we are well on our way to using it to create a world of suffering and chaos. Paradoxical, no?”*…
Joseph Weizenbaum, a distinguished professor at MIT, was one of the fathers of artificial intelligence and computing as we know it; he was also one of its earliest critics– one whose concerns remain all too current. After a review of his warnings, Librarian Shipwreck shares a still-relevant set of questions Weizenbaum proposed…
At the end of his essay “Once more—A Computer Revolution,” which appeared in the Bulletin of the Atomic Scientists in 1978, Weizenbaum concluded with a set of five questions. As he put it, these were the sorts of questions that “are almost never asked” when it comes to this or that new computer-related development. These questions did not lend themselves to simple yes or no answers, but instead called for serious debate and introspection. Thus, in the spirit of that article, let us conclude this piece not with definitive answers, but with more questions for all of us to contemplate. Questions that were “almost never asked” in 1978, and which are still “almost never asked” in 2023. They are as follows:
• Who is the beneficiary of our much-advertised technological progress and who are its victims?
• What limits ought we, the people generally and scientists and engineers particularly, to impose on the application of computation to human affairs?
• What is the impact of the computer, not only on the economies of the world or on the war potential of nations, etc…but on the self-image of human beings and on human dignity?
• What irreversible forces is our worship of high technology, symbolized most starkly by the computer, bringing into play?
• Will our children be able to live with the world we are here and now constructing?
As Weizenbaum put it “much depends on answers to these questions.”
Much still depends on answers to these questions.
Eminently worth reading in full: “‘Computers enable fantasies’ – on the continued relevance of Weizenbaum’s warnings,” from @libshipwreck.
See also: “An island of reason in the cyberstream – on the life and thought of Joseph Weizenbaum.”
* Joseph Weizenbaum (1983)
###
As we stay grounded, we might spare a thought for George Stibitz; he died on this date in 1995. A Bell Labs researcher, he was known for his work in the 1930s and 1940s on the realization of Boolean logic digital circuits using electromechanical relays as the switching element– work for which he is internationally recognized as one of the fathers of the modern digital computer.
In 1937, Stibitz, a scientist at Bell Laboratories, built a digital machine based on relays, flashlight bulbs, and metal strips cut from tin cans. He called it the “Model K” because most of it was constructed on his kitchen table. It worked on the principle that if two relays were activated, they caused a third relay to become active; this third relay represented the sum of the operation. Then, in 1940, he gave a demonstration of the first remote operation of a computer.
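The relay addition described above can be modeled in a few lines of Python. This is a logical sketch of the principle, with relays represented as booleans; it is not a reconstruction of the actual 1937 circuit:

```python
# A minimal logical model of one-bit binary addition, in the spirit of
# Stibitz's relay "Model K". Each relay is modeled as a boolean:
# energized (True) or not (False).

def relay_add(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit inputs; return (sum_bit, carry_bit)."""
    carry = a and b   # the "third relay": active only when both inputs are
    total = a != b    # sum bit: exclusive-or of the two inputs
    return total, carry

# Truth table for one-bit addition: 1 + 1 = binary 10 (sum 0, carry 1)
for a in (False, True):
    for b in (False, True):
        s, c = relay_add(a, b)
        print(f"{int(a)} + {int(b)} -> carry={int(c)} sum={int(s)}")
```

In Boolean terms, the “two relays activating a third” is the AND gate that produces the carry bit, while the sum bit is the exclusive-or of the inputs.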
“Progress means getting nearer to the place you want to be. And if you have taken a wrong turn, then to go forward does not get you any nearer.”*…
Earlier (Roughly) Daily posts have looked at “Progress Studies” and at its relationship to the Rationalism community. Garrison Lovely takes a deeper look at this growing and influential intellectual movement that aims to understand why human progress happens – and how to speed it up…
For most of history, the world improved at a sluggish pace, if at all. Civilisations rose and fell. Fortunes were amassed and squandered. Almost every person in the world lived in what we would now call extreme poverty. For thousands of years, global wealth – at least our best approximations of it – barely budged.
But beginning around 150-200 years ago, everything changed. The world economy suddenly began to grow exponentially. Global life expectancy climbed from less than 30 years to more than 70 years. Literacy, extreme poverty, infant mortality, and even height improved in a similarly dramatic fashion. The story may not be universally positive, nor have the benefits been equally distributed, but by many measures, economic growth and advances in science and technology have changed the way of life for billions of people.
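The contrast between millennia of near-stasis and two centuries of takeoff is, at bottom, a story about compounding. A toy calculation (the growth rates here are illustrative assumptions, not historical data) shows how different the two regimes are:

```python
# Illustrative compounding: near-zero growth over a millennium versus
# the roughly 2% a year associated with modern industrial economies.
# Both rates are assumptions for the sake of the example.

def compound(rate: float, years: int) -> float:
    """Factor by which a quantity grows at `rate` per year over `years`."""
    return (1 + rate) ** years

premodern = compound(0.0005, 1000)  # 0.05% a year for a thousand years
modern = compound(0.02, 200)        # 2% a year for two centuries

print(f"0.05%/yr for 1000 years: x{premodern:.1f}")
print(f"2%/yr for 200 years:     x{modern:.1f}")
```

At the slower rate, a thousand years of growth does not even double the starting quantity; at the faster rate, two centuries multiply it more than fifty-fold, which is why the modern era looks like a sudden explosion on any long-run chart.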
What explains this sudden explosion in relative wealth and technological power? What happens if it slows down, or stagnates? And if so, can we do something about it? These are key questions of “progress studies”, a nascent self-styled academic field and intellectual movement, which aims to dissect the causes of human progress in order to better advance it.
Founded by an influential economist and a billionaire entrepreneur, this community tends to define progress in terms of scientific or technological advancement, and economic growth – and therefore their ideas and beliefs are not without their critics. So, what does the progress studies movement believe, and what do they want to see happen in the future?
Find out at: “Do we need a better understanding of ‘progress’?,” from @GarrisonLovely at @BBC_Future.
Then judge for yourself: was Adorno right? “It would be advisable to think of progress in the crudest, most basic terms: that no one should go hungry anymore, that there should be no more torture, no more Auschwitz. Only then will the idea of progress be free from lies.” Or can– should– we be more purposively, systemically ambitious?
* C. S. Lewis
###
As we get better at getting better, we might recall that it was on this date in 1922 that the United States paid tribute to a man instrumental in the progress that Progress Studies is anxious to sustain, Alexander Graham Bell…
There were more than 14 million telephones in the United States by the time Alexander Graham Bell died. For one minute on August 4, 1922, they were all silent.
The reason: Bell’s funeral. The American inventor was the first to patent telephone technology in the United States and founded the Bell Telephone System in 1877. Though Bell wasn’t the only person to invent “the transmission of speech by electrical wires,” writes Randy Alfred for Wired, achieving patent primacy in the United States allowed him to spend his life inventing. Even though the telephone changed the world, Bell didn’t stop there.
Bell died on August 2, 1922, at the age of 75. “As a mark of respect every telephone exchange in the United States and Canada closed for a minute when his funeral began around 6:30 p.m. Eastern Standard Time,” Alfred writes.
On the day of the funeral, The New York Times reported that Bell was also honored by advocates for deaf people. “Entirely apart from the monumental achievement of Professor Bell as the inventor of the telephone, his conspicuous work in [sic] behalf of the deaf of this country would alone entitle him to everlasting fame,” said Felix H. Levey, president of the Institution for the Improved Instruction of Deaf Mutes.
In fact, Bell spent much of his income from the telephone on helping deaf people. In 1880, three years after founding the Bell Telephone System, Bell founded the Volta Laboratory. The laboratory, originally called Volta Associates, capitalized on Bell’s work and the work of other sound pioneers. It made money by patenting new innovations for the gramophone and other recorded sound technologies. In 1887, Bell took his share of the money from the sale of gramophone patents and founded the Volta Bureau “as an instrument for the increase and diffusion of knowledge relating to the Deaf,” writes the National Park Service. Bell and Volta continued to work for deaf rights throughout his life.
Volta Laboratory eventually became Bell Laboratories, which was home to many of the twentieth century’s communication innovations.
Smithsonian
“Monetary policy is one of the most difficult topics in economics. But also, I believe, a topic of absolutely crucial importance for our prosperity.”*…

What can we learn from a twentieth century economist who was a critic of Keynes and a staunch advocate of the Gold Standard? Samuel Gregg considers the career of Jacques Rueff…
Money, it is often said, makes the world go round. The inverse of that axiom is that monetary disorder brings chaos in its wake. As we learned from the hyperinflation that wreaked havoc in 1920s Germany and the stagflation which hobbled Western economies throughout the 1970s, the effects of such disorder go far beyond the economy. Further complicating the problem is that restoring monetary stability is invariably a painful exercise, often bringing unemployment, recession and lasting social damage in its wake.
As a rule, monetary theory and monetary policy are dry affairs, dominated by highly technical discussions concerning topics such as the nature of capital or the likely impact of interest-rates set by central banks. One thinker who did not conform to this mould was the French monetary theorist Jacques Rueff (1896-1978). Arguably France’s most important twentieth-century economist, Rueff played a major role in shaping the Third Republic’s response to the Great Depression in the 1930s, designed the market liberalisation programme that saved France from economic collapse in 1958, and emerged in the 1960s as the leading critic of the US dollar’s role in the global economy and a prominent advocate of a return to the classic gold standard.
Rueff was, however, much more than an economist. A graduate of the École Polytechnique, he was among that small elite of civil servants trained in administration, engineering, mathematics, the natural sciences, foreign languages, and political economy whose role was to inject stability into the perpetual political pandemonium of the Third Republic. But even among that highly-educated cohort, Rueff stood out for the breadth and depth of his knowledge and his willingness to integrate it into his economic reflections. For Rueff, the significance of monetary order went beyond issues such as economic growth or employment, as important as they were. Ultimately, it was about whether Western civilisation flourished or embraced self-delusion…
Gregg recounts Rueff’s career, his championing of “real rights” (e.g., property rights) vs. “false rights” (which involve the state declaring something such as unemployment benefits to be a right and then trying to realize it through means that destroy real rights), and his advocacy of a return to the Gold Standard (part of his critique of the use of the U.S. dollar as a unit of reserve)… all positions with which reasonable people (including your correspondent) might disagree. But Gregg reminds us that Rueff’s most fundamental goal– a healthy society– surely remains desirable, and that his fear of the chaos that monetary meltdowns can cause is only too justified…
Monetary order wasn’t everything for Rueff. His writings reflect deep awareness of the ways in which culture, religion, philosophy, music and literature influenced civilisational development. Nonetheless Rueff insisted the threats posed by monetary disorder were more than economic. For him, civilisational growth was impossible without monetary order…
Let us not allow means with which we disagree to obscure important ends.
After examining the economic chaos of the early twentieth century, monetary theorist Jacques Rueff argued that without monetary order, civilizational growth is impossible: “Jacques Rueff’s quest for monetary order,” from @DrSamuelGregg in @EngelsbergIdeas.
* Maxime Bernier
###
As we remember that neither should we allow ends with which we disagree to obscure important means, we might spare a thought for Leonid Kantorovich; he died on this date in 1986. An economist and mathematician best known for his theory and development of techniques for the optimal allocation of resources, he is regarded as the founder of linear programming— for which he received the Nobel Memorial Prize in Economic Sciences in 1975.