“Great minds think alike”*…

Brian Potter on the (perhaps surprising) frequency with which “heroic” inventors are in fact better understood as the winners of close races…
When Alexander Graham Bell filed a patent for the telephone on February 14th, 1876, he beat competing telephone developer Elisha Gray to the patent office by just a few hours. The resulting legal dispute between Bell Telephone and Western Union (which owned the rights to Gray’s invention) would consume millions of dollars before being resolved in Bell’s favor in 1879.
Such cases of multiple invention are common, and some of the most famous and important modern inventions were invented in parallel. Both Thomas Edison and Joseph Swan patented incandescent lightbulbs in 1880. Jack Kilby and Robert Noyce patented integrated circuits in 1959. Hans von Ohain and Frank Whittle independently invented the jet engine in the 1930s. In a 1922 paper, William Ogburn and Dorothy Thomas documented 150 cases of multiple discovery in science and technology. Robert Merton found 261 examples in 1961, and observed that the phenomenon of multiple discovery was itself a multiple discovery, having been described over and over again since at least the early 19th century.
But exactly how common is multiple invention? The frequency of examples suggests that it can’t be particularly rare, but that doesn’t tell us the rate at which it occurs. In “How Common is Independent Discovery?,” Matt Clancy catalogues several attempts to estimate the frequency of multiple discovery, and tentatively comes up with a frequency of around 2-3% for simultaneous scientific discoveries, and perhaps an 8% chance that a given invention will be reinvented in the next decade. But the evidence for inventions is somewhat inconsistent and varies greatly between studies: another study Clancy found, which looked at patent interference lawsuits between 1998 and 2014, suggests an independent invention rate of only around 0.02% per year.
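To see how far apart those two estimates really are, here is a back-of-the-envelope check (my own calculation, not Clancy’s): converting an 8%-per-decade reinvention chance into an annual rate, under the simplifying assumption that the yearly chance is constant and independent, and comparing it with the 0.02%-per-year patent-interference figure.

```python
# Convert ~8% per decade into an implied annual rate, assuming a constant,
# independent chance of reinvention each year: solve 1 - (1 - p)^10 = 0.08.
p_decade = 0.08
p_year = 1 - (1 - p_decade) ** (1 / 10)   # annual rate implied by 8%/decade
ratio = p_year / 0.0002                   # vs. the ~0.02%/year estimate

print(f"implied annual rate: {p_year:.4%}")   # roughly 0.8% per year
print(f"discrepancy factor: ~{ratio:.0f}x")
```

Even under this crude model, the per-decade estimate implies an annual rate dozens of times higher than the patent-interference study’s, which underscores just how inconsistent the evidence is.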
The frequency of multiple invention is a useful thing to know, because it can give us clues about the nature of technological progress. A very low rate of multiple invention suggests that progress might be driven by a small number of “genius” inventors (what we might call the Great Man Theory of technological progress), and that it might be highly historically contingent (if you re-rolled the dice of history, maybe you get a totally new set of inventions and a different technological palette). A high rate of multiple invention suggests that progress is more a function of broad historical forces (that inventions appear when the conditions are right), and that progress is less contingent (if you re-rolled the dice of history, you’d get a similar progression of inventions). And if the rate of multiple invention is changing over time, perhaps the nature of technological progress is changing as well…
[Potter reviews the history and concludes that “multiple invention was extremely common”…]
… My main takeaway is that the ideas behind inventions are often in some sense “obvious,” or at least not so surprising or unexpected that many people won’t think of them. In some cases, this is probably because once some new possibility comes along, lots of people think of similar things that could be done with it. Once the properties of electricity began to be understood, many people came up with the idea of using it to send signals (telephone, telegraph), or to create motion (engines and generators), or to generate light (arc lamps, incandescent lights). Once the steam engine came along, lots of people had the idea to use it to power various types of vehicles.
In other cases, multiple invention probably occurs because important problems will attract many people trying to solve them. Steel corrosion was a large problem inspiring many folks to look for ways to create a steel that didn’t rust, or notice the potential value if they stumbled across such a material. Lamps causing mine fires were a major problem, inspiring many people to come up with ideas for safety lamps. The smoke produced by gunpowder was a major problem, inspiring many efforts to develop smokeless powders. And because would-be inventors will all draw from the same pool of available technologies, materials, and capabilities when coming up with a solution, there will be a large degree of convergence in the solutions they come up with…
Fascinating: “How Common is Multiple Invention?” from @const-physics.blogsky.venki.dev.
* common idiom
###
As we reconsider credit, we might recall that it was on this date in 1661 that Isaac Newton— a key figure in the Scientific Revolution and the Enlightenment that followed– entered Trinity College, Cambridge. Soon after Newton obtained his BA degree at Cambridge in August 1665, the university temporarily closed as a precaution against the Great Plague. Although he had been undistinguished as a Cambridge student, his private studies and the years following his bachelor’s degree have been described as “the richest and most productive ever experienced by a scientist.”
Relevantly to the piece above, Newton was party to a dispute with Gottfried Wilhelm Leibniz (who started, at age 14, at the University of Leipzig the same year that Newton matriculated at Cambridge) over which of them developed calculus– called “the greatest advance in mathematics that had taken place since the time of Archimedes.” The modern consensus is that the two men independently developed their ideas.

“Truth is ever to be found in the simplicity, and not in the multiplicity and confusion of things”*…
From Kim (Scott) Morrison‘s and Dror Bar-Natan‘s The Knot Atlas, “a complete user-editable knot atlas, in the wiki spirit of Wikipedia“– a marvelous example of a widespread urge in mathematics to find order through classification. As Joseph Howlett explains, that quest continues, even as it proves vexatious…
Biology in the 18th century was all about taxonomy. The staggering diversity of life made it hard to draw conclusions about how it came to be. Scientists first had to put things in their proper order, grouping species according to shared characteristics — no easy task. Since then, they’ve used these grand catalogs to understand the differences among organisms and to infer their evolutionary histories. Chemists built the periodic table for the same purpose — to classify the elements and understand their behaviors. And physicists made the Standard Model to explain how the fundamental particles of the universe interact.
In his book The Order of Things, the philosopher Michel Foucault describes this preoccupation with sorting as a formative step for the sciences. “A knowledge of empirical individuals,” he wrote, “can be acquired only from the continuous, ordered and universal tabulation of all possible differences.”
Mathematicians never got past this obsession. That’s because the menagerie of mathematics makes the biological catalog look like a petting zoo. Its inhabitants aren’t limited by physical reality. Any conceivable possibility, whether it lives in our universe or in some hypothetical 200-dimensional one, needs to be accounted for. There are tons of different classifications to try — groups, knots, manifolds and so on — and infinitely many objects to sort in each of those classifications. Classification is how mathematicians come to know the strange, abstract world they’re studying, and how they prove major theorems about it.
Take groups, a central object of study in math. The classification of “finite simple groups” — the building blocks of all groups — was one of the grandest mathematical accomplishments of the 20th century. It took dozens of mathematicians nearly 100 years to finish. In the end, they figured out that all finite simple groups fall into three buckets, except for 26 itemized outliers. A dedicated crew of mathematicians has been working on a “condensed” proof of the classification since 1994 — it currently comprises 10 volumes and several thousand pages, and still isn’t finished. But the gargantuan undertaking continues to bear fruit, recently helping to prove a decades-old conjecture that you can infer a lot about a group by examining one small part of it.
Mathematics, unfettered by the typical constraints of reality, is all about possibility. Classification gives mathematicians a way to start exploring that limitless potential…[Howlett reviews attempts to classify numbers by “type” (positive/negative, rational/irrational), and mathematical objects by “equivalency” (shapes that can be stretched or squeezed into one another without breaking or tearing, like a doughnut and a coffee cup (see here)…]
… Similarly, classification has played an important role in knot theory. Tie a knot in a piece of string, then glue the string’s ends together — that’s a mathematical knot. Knots are equivalent if one can be tangled or untangled, without cutting the string, to match the other. This mundane-sounding task has lots of mathematical uses. In 2023, five mathematicians made progress on a key conjecture in knot theory that stated that all knots with a certain property (being “slice”) must also have another (being “ribbon”), with the proof ruling out a suspected counterexample. (As an aside, I’ve often wondered why knot theorists insist on using nouns as adjectives.)
Classifications can also get more meta. Both theoretical computer scientists and mathematicians classify problems about classification based on how “hard” they are.
All these classifications turn math’s disarrayed infinitude into accessible order. It’s a first step toward reining in the deluge that pours forth from mathematical imaginings…
“The Never-Ending Struggle to Classify All Math,” from @quantamagazine.bsky.social.
* Isaac Newton
###
As we sort, we might spare a thought for the author of our title quote, Sir Isaac Newton; he died on this date in 1727. A polymath, Newton excelled in– and advanced– mathematics, physics, and astronomy; he was a theologian and a government official (Master of the Mint)… and a dedicated alchemist. He was key to the Scientific Revolution and the Enlightenment that followed.
Newton’s book Philosophiæ Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), first published in 1687, achieved the first great unification in physics and established classical mechanics (e.g., the Laws of Motion and the principle of universal gravitation). He also made seminal contributions to optics, and shares credit with German mathematician Gottfried Wilhelm Leibniz for formulating infinitesimal calculus. Indeed, Newton contributed to and refined the scientific method to such an extent that his work is considered the most influential in the development of modern science.
“There is only one world, the natural world, exhibiting patterns we call the ‘laws of nature’”*…

The quote above (given in full below) captures the substantive understanding of scientific naturalism that reigns today. Indeed, the modern era is often seen as the triumph of science over supernaturalism. But, as Peter Harrison explains, what really happened is far more interesting…
By any measure, the scientific revolution of the 17th century was a significant milestone in the emergence of our modern secular age. This remarkable historical moment is often understood as science finally liberating itself from the strictures of medieval religion, striking out on a new path that eschewed theological explanations and focused its attentions solely on a disenchanted, natural world. But this version of events is, at best, half true.
Medieval science, broadly speaking, had followed Aristotle in seeking explanations in terms of the inherent causal properties of natural things. God was certainly involved, at least to the extent that he had originally invested things with their natural properties and was said to ‘concur’ with their usual operations. Yet the natural world had its own agency. Beginning in the 17th century, the French philosopher and scientist René Descartes and his fellow intellectual revolutionaries dispensed with the idea of internal powers and virtues. They divested natural objects of inherent causal powers and attributed all motion and change in the universe directly to natural laws.
But, for all their transformative influence, key agents in the scientific revolution such as Descartes, Johannes Kepler, Robert Boyle and Isaac Newton are not our modern and secular forebears. They did not share our contemporary understandings of the natural or our idea of ‘laws of nature’ that we imagine underpins that naturalism…
[Harrison traces the history of the often contentious, but ultimately momentous rise of naturalism, then considers the historical accounts of that ascension– and what they gloss over or miss altogether. He then turns to why that matters…]
… the contrived histories of naturalism that purport to show its victory over supernaturalism were fabricated in the 19th century and are simply not consistent with the historical evidence. They are also tainted by a cultural condescension that, in the past at least, descended into outright racism. Few, if any, would today endorse the chauvinism that attends these older, triumphalist accounts of the history of naturalism. Yet, it is worth reflecting upon the extent to which elements of cultural condescension necessarily colour scholarly endeavours that are premised on the imagined ‘neutral’ grounds of naturalism. Careful consideration of the contingent historical circumstances that gave rise to present analytic categories that enjoy significant standing and authority would suggest that there is nothing especially neutral or objective about them. Any clear-eyed cross-cultural comparison – one that refrains from assessing worldviews in terms of how they measure up to the standard of the modern West – will reinforce this. We might go so far as to adopt a form of ‘reverse anthropology’, where we think how our own conceptions of the world might look if we adopted the frameworks of others. This might entail dispensing with the idea of the supernatural, and attempting to think outside the box of our recently inherited natural/supernatural distinction.
History [that is, the “actual” history that Harrison recounts] suggests that our regnant modern naturalism is deeply indebted to monotheism, and that its adherents may need to abandon the comforting idea that their naturalistic commitments are licensed by the success of science. As for the idea of the supernatural, ironically this turns out to be far more important for the identity of those who wish to deny its reality than it had ever been for traditional religious believers…
Fascinating and provocative: “The birth of naturalism,” from @uqpharri in @aeonmag.
* “There is only one world, the natural world, exhibiting patterns we call the ‘laws of nature’, and which is discoverable by the methods of science and empirical investigation. There is no separate realm of the supernatural, spiritual, or divine; nor is there any cosmic teleology or transcendent purpose inherent in the nature of the universe or in human life.” – Sean Carroll, The Big Picture
###
As we rethink reality, we might recall that it was on this date in 1588 that Tycho Brahe first outlined his “Tychonic system” concept of the structure of the solar system. The Tychonic system was a hybrid, sharing both the basic idea of the geocentric system of Ptolemy and the heliocentric idea of Nicolaus Copernicus. Published in his De mundi aetherei recentioribus phaenomenis, Tycho’s proposal, retaining Aristotelian physics, kept the Sun and Moon revolving about Earth in the center of the universe, with the shell of the fixed stars, at a great distance, likewise centered on the Earth. But like Copernicus, he agreed that Mercury, Venus, Mars, Jupiter, and Saturn revolved about the Sun. Thus he could explain the motions of the heavens without “crystal spheres” carrying the planets through complex Ptolemaic epicycles.

On this same date, in 1633, Galileo Galilei arrived in Rome to face trial before the Inquisition. His crime was professing the belief that the Earth revolves around the Sun– based on observations that he’d made building on Copernicus and Tycho.

“Romanticism is precisely situated neither in choice of subject, nor exact truth, but in the way of feeling”*…
The estimable Ted Gioia is exploring the possibility that we are on the cusp of a major change in the zeitgeist– the beginning of a new age of Romanticism…
I made a flippant remark a few months ago. It was almost a joke.
But then I started taking it seriously.
I said that technocracy had grown so oppressive and manipulative it would spur a backlash. And that our rebellion might resemble the Romanticist movement of the early 1800s.
We need a new Romanticism, I quipped. And we will probably get one.
A new Romanticism? Could that really happen? That seems so unlikely.
Even I didn’t take this seriously (at first). I was just joking. But during the subsequent weeks and months, I kept thinking about my half-serious claim.
I realized that, the more I looked at what happened circa 1800, the more it reminded me of our current malaise.
- Rationalist and algorithmic models were dominating every sphere of life at that midpoint in the Industrial Revolution—and people started resisting the forces of progress.
- Companies grew more powerful, promising productivity and prosperity. But Blake called them “dark Satanic mills” and Luddites started burning down factories—a drastic and futile step, almost the equivalent of throwing away your smartphone.
- Even as science and technology produced amazing results, dysfunctional behaviors sprang up everywhere. The pathbreaking literary works from the late 1700s reveal the dark side of the pervasive techno-optimism—Goethe’s novel about Werther’s suicide, the Marquis de Sade’s nasty stories, and all those gloomy Gothic novels. What happened to the Enlightenment?
- As the new century dawned, the creative class (as we would call it today) increasingly attacked rationalist currents that had somehow morphed into violent, intrusive forces in their lives—a 180-degree shift in the culture. For Blake and others, the name Newton became a term of abuse.
- Artists, especially poets and musicians, took the lead in this revolt. They celebrated human feeling and emotional attachments—embracing them as more trustworthy, more flexible, more desirable than technology, profits, and cold calculation.
That’s the world, circa 1800.
The new paradigm shocked Europe when it started to spread. Cultural elites had just assumed that science and reason would control everything in the future. But that wasn’t how it played out.
Resemblances with the current moment are not hard to see.
These considerations led me, about nine months ago, to conduct a deep dive into the history of the Romanticist movement. I wanted to see what the historical evidence told me.
…
I’m now structuring my research in chronological order—that’s a method I often use in addressing big topics.
I make no great promises for what I share below. These are just notes on what happened in Western culture from 1800 to 1804—listed year-by-year.
Sharing these is part of my process. I expect this will generate useful feedback, and guide me on the next phase of this project…
Because music is always my entry point into cultural changes, it plays a key role here in how I analyze past (and present) events. I firmly believe that music is an early indicator of social change. The notes below are offered as evidence in support of that view…
[There follows a fascinating– and compelling– account of those five years, featuring Napoleon, Haydn, Beethoven, Wordsworth, Coleridge, Herder, Schelling, the Marquis de Sade, Novalis, Ann Radcliffe, and others]
… Beethoven turns against Napoleon—and this is emblematic of the aesthetic reversal sweeping through Europe. Not long ago, Beethoven and other artists looked to French rationalism as a harbinger of a new age of freedom and individual flourishing. But this entire progress-obsessed ideology is unraveling.
It’s somehow fitting that music takes the lead role in deconstructing a tyrannical rationalism, and proposing a more human alternative.
Could that happen again?
- Imagine a growing sense that algorithmic and mechanistic thinking has become too oppressive.
- Imagine if people started resisting technology as a malicious form of control, and not a pathway to liberation, empowerment, and human flourishing—soul-nurturing riches that must come from someplace deeper.
- Imagine a revolt against STEM’s dominance and dictatorship over all other fields?
- Imagine people deciding that the good life starts with NOT learning how to code.
If that happened now, wouldn’t music stand out as the pathway? What could possibly be more opposed to brutal rationalism running out of control than a song?
But what does that kind of music sound like? In 1800, it was Beethoven. And today?…
Why it may be 1800 all over again: “Notes Toward a New Romanticism,” from @tedgioia in his terrific newsletter, The Honest Broker.
* Charles Baudelaire
###
As we review vibes on the verge, we might send rational birthday greetings to an avatar of the Enlightenment against which the Romantics rebelled, François-Marie Arouet, better known as Voltaire; he was born on this date in 1694. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.
A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.






