(Roughly) Daily

Posts Tagged ‘Isaac Newton’

“Great minds think alike”*…

Patent sketches of Gray’s (top) and Bell’s (bottom) telephones, via Hounshell

Brian Potter on the (perhaps surprising) frequency with which “heroic” inventors are in fact better understood as the winners of close races…

When Alexander Graham Bell filed a patent for the telephone on February 14th, 1876, he beat competing telephone developer Elisha Gray to the patent office by just a few hours. The resulting legal dispute between Bell Telephone and Western Union (which owned the rights to Gray’s invention) would consume millions of dollars before being resolved in Bell’s favor in 1879.

Such cases of multiple invention are common, and some of the most famous and important modern inventions were invented in parallel. Both Thomas Edison and Joseph Swan patented incandescent lightbulbs in 1880. Jack Kilby and Robert Noyce patented integrated circuits in 1959. Hans von Ohain and Frank Whittle independently invented the jet engine in the 1930s. In a 1922 paper, William Ogburn and Dorothy Thomas documented 150 cases of multiple discovery in science and technology. Robert Merton found 261 examples in 1961, and observed that the phenomenon of multiple discovery was itself a multiple discovery, having been described over and over again since at least the early 19th century.

But exactly how common is multiple invention? The frequency of examples suggests that it can’t be particularly rare, but that doesn’t tell us the rate at which it occurs. In “How Common is Independent Discovery?,” Matt Clancy catalogues several attempts to estimate the frequency of multiple discovery, and tentatively comes up with a frequency of around 2-3% for simultaneous scientific discoveries, and perhaps an 8% chance that a given invention will be reinvented in the next decade. But the evidence for inventions is inconsistent and varies greatly between studies: another study Clancy found, which looked at patent interference lawsuits between 1998 and 2014, suggests an independent invention rate of only around 0.02% per year.
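To see how far apart those two estimates really are, it helps to put them on the same footing. A minimal back-of-the-envelope sketch (the compounding assumption is mine, not Clancy’s or the study’s): an annual rate r implies a per-decade probability of 1 − (1 − r)^10.

```python
# Back-of-the-envelope: convert the ~0.02%-per-year estimate from the
# patent-interference study into a per-decade probability, so it can be
# compared directly with Clancy's ~8%-per-decade figure.
annual_rate = 0.0002                      # ~0.02% per year
per_decade = 1 - (1 - annual_rate) ** 10  # chance of at least one reinvention in 10 years
print(f"{per_decade:.2%}")                # ~0.20% per decade
```

Even after compounding, the lawsuit-based estimate comes out around 0.2% per decade, roughly forty times lower than the 8% figure, which makes concrete just how inconsistent the evidence is.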

The frequency of multiple invention is a useful thing to know, because it can give us clues about the nature of technological progress. A very low rate of multiple invention suggests that progress might be driven by a small number of “genius” inventors (what we might call the Great Man Theory of technological progress), and that it might be highly historically contingent (if you re-rolled the dice of history, maybe you get a totally new set of inventions and a different technological palette). A high rate of multiple invention suggests that progress is more a function of broad historical forces (that inventions appear when the conditions are right), and that progress is less contingent (if you re-rolled the dice of history, you’d get a similar progression of inventions). And if the rate of multiple invention is changing over time, perhaps the nature of technological progress is changing as well…

[Potter reviews the history and concludes that “multiple invention was extremely common”…]

… My main takeaway is that the ideas behind inventions are often in some sense “obvious,” or at least not so surprising or unexpected that many people won’t think of them. In some cases, this is probably because once some new possibility comes along, lots of people think of similar things that could be done with it. Once the properties of electricity began to be understood, many people came up with the idea of using it to send signals (telephone, telegraph), or to create motion (engines and generators), or to generate light (arc lamps, incandescent lights). Once the steam engine came along, lots of people had the idea to use it to power various types of vehicles.

In other cases, multiple invention probably occurs because important problems will attract many people trying to solve them. Steel corrosion was a large problem inspiring many folks to look for ways to create a steel that didn’t rust, or notice the potential value if they stumbled across such a material. Lamps causing mine fires were a major problem, inspiring many people to come up with ideas for safety lamps. The smoke produced by gunpowder was a major problem, inspiring many efforts to develop smokeless powders. And because would-be inventors will all draw from the same pool of available technologies, materials, and capabilities when coming up with a solution, there will be a large degree of convergence in the solutions they come up with…

Fascinating: “How Common is Multiple Invention?” from @const-physics.blogsky.venki.dev.

* common idiom

###

As we reconsider credit, we might recall that it was on this date in 1661 that Isaac Newton– a key figure in the Scientific Revolution and the Enlightenment that followed– entered Trinity College, Cambridge. Soon after Newton obtained his BA degree at Cambridge in August 1665, the university temporarily closed as a precaution against the Great Plague. Although he had been undistinguished as a Cambridge student, the private studies he pursued in the years following his bachelor’s degree have been described as “the richest and most productive ever experienced by a scientist.”

Relevant to the piece above, Newton was party to a dispute with Gottfried Wilhelm Leibniz (who started, at age 14, at the University of Leipzig the same year that Newton matriculated at Cambridge) over which of them developed calculus– called “the greatest advance in mathematics that had taken place since the time of Archimedes.” The modern consensus is that the two men independently developed their ideas.

Statues of Isaac Newton and Gottfried Wilhelm Leibniz in the courtyard of the Oxford University Museum of Natural History (source)

Written by (Roughly) Daily

June 9, 2025 at 1:00 am

“Truth is ever to be found in the simplicity, and not in the multiplicity and confusion of things”*…

Knots with 8 crossings

From Kim (Scott) Morrison‘s and Dror Bar-Natan‘s The Knot Atlas, “a complete user-editable knot atlas, in the wiki spirit of Wikipedia“– a marvelous example of a widespread urge in mathematics to find order through classification. As Joseph Howlett explains, that quest continues, even as it proves vexatious…

Biology in the 18th century was all about taxonomy. The staggering diversity of life made it hard to draw conclusions about how it came to be. Scientists first had to put things in their proper order, grouping species according to shared characteristics — no easy task. Since then, they’ve used these grand catalogs to understand the differences among organisms and to infer their evolutionary histories. Chemists built the periodic table for the same purpose — to classify the elements and understand their behaviors. And physicists made the Standard Model to explain how the fundamental particles of the universe interact.
 
In his book The Order of Things, the philosopher Michel Foucault describes this preoccupation with sorting as a formative step for the sciences. “A knowledge of empirical individuals,” he wrote, “can be acquired only from the continuous, ordered and universal tabulation of all possible differences.”
 
Mathematicians never got past this obsession. That’s because the menagerie of mathematics makes the biological catalog look like a petting zoo. Its inhabitants aren’t limited by physical reality. Any conceivable possibility, whether it lives in our universe or in some hypothetical 200-dimensional one, needs to be accounted for. There are tons of different classifications to try — groups, knots, manifolds and so on — and infinitely many objects to sort in each of those classifications. Classification is how mathematicians come to know the strange, abstract world they’re studying, and how they prove major theorems about it.

Take groups, a central object of study in math. The classification of “finite simple groups” — the building blocks of all groups — was one of the grandest mathematical accomplishments of the 20th century. It took dozens of mathematicians nearly 100 years to finish. In the end, they figured out that all finite simple groups fall into three buckets, except for 26 itemized outliers. A dedicated crew of mathematicians has been working on a “condensed” proof of the classification since 1994 — it currently comprises 10 volumes and several thousand pages, and still isn’t finished. But the gargantuan undertaking continues to bear fruit, recently helping to prove a decades-old conjecture that you can infer a lot about a group by examining one small part of it.
 
Mathematics, unfettered by the typical constraints of reality, is all about possibility. Classification gives mathematicians a way to start exploring that limitless potential…

[Howlett reviews attempts to classify numbers by “type” (positive/negative, rational/irrational), and mathematical objects by “equivalency” (shapes that can be stretched or squeezed into each other without breaking or tearing, like a doughnut and a coffee cup (see here)…]

… Similarly, classification has played an important role in knot theory. Tie a knot in a piece of string, then glue the string’s ends together — that’s a mathematical knot. Knots are equivalent if one can be tangled or untangled, without cutting the string, to match the other. This mundane-sounding task has lots of mathematical uses. In 2023, five mathematicians made progress on a key conjecture in knot theory that stated that all knots with a certain property (being “slice”) must also have another (being “ribbon”), with the proof ruling out a suspected counterexample. (As an aside, I’ve often wondered why knot theorists insist on using nouns as adjectives.)

Classifications can also get more meta. Both theoretical computer scientists and mathematicians classify problems about classification based on how “hard” they are.
 
All these classifications turn math’s disarrayed infinitude into accessible order. It’s a first step toward reining in the deluge that pours forth from mathematical imaginings…

“The Never-Ending Struggle to Classify All Math,” from @quantamagazine.bsky.social.

* Isaac Newton

###

As we sort, we might spare a thought for the author of our title quote, Sir Isaac Newton; he died on this date in 1727. A polymath, Newton excelled in– and advanced– mathematics, physics, and astronomy; he was a theologian and a government official (Master of the Mint)… and a dedicated alchemist. He was key to the Scientific Revolution and the Enlightenment that followed.

Newton’s book Philosophiæ Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), first published in 1687, achieved the first great unification in physics and established classical mechanics (e.g., the Laws of Motion and the principle of universal gravitation). He also made seminal contributions to optics, and shares credit with German mathematician Gottfried Wilhelm Leibniz for formulating infinitesimal calculus.  Indeed, Newton contributed to and refined the scientific method to such an extent that his work is considered the most influential in the development of modern science.

source

Written by (Roughly) Daily

March 20, 2025 at 1:00 am

“There is only one world, the natural world, exhibiting patterns we call the ‘laws of nature’”*…

From Liber Divinorum Operum (The Book of Divine Works) by Hildegard von Bingen, composed between 1163 and 1174

The quote above (given in full below) expresses the substantive understanding of scientific naturalism that reigns today. Indeed, the modern era is often seen as the triumph of science over supernaturalism. But, as Peter Harrison explains, what really happened is far more interesting…

By any measure, the scientific revolution of the 17th century was a significant milestone in the emergence of our modern secular age. This remarkable historical moment is often understood as science finally liberating itself from the strictures of medieval religion, striking out on a new path that eschewed theological explanations and focused its attentions solely on a disenchanted, natural world. But this version of events is, at best, half true.

Medieval science, broadly speaking, had followed Aristotle in seeking explanations in terms of the inherent causal properties of natural things. God was certainly involved, at least to the extent that he had originally invested things with their natural properties and was said to ‘concur’ with their usual operations. Yet the natural world had its own agency. Beginning in the 17th century, the French philosopher and scientist René Descartes and his fellow intellectual revolutionaries dispensed with the idea of internal powers and virtues. They divested natural objects of inherent causal powers and attributed all motion and change in the universe directly to natural laws.

But, for all their transformative influence, key agents in the scientific revolution such as Descartes, Johannes Kepler, Robert Boyle and Isaac Newton are not our modern and secular forebears. They did not share our contemporary understandings of the natural or our idea of ‘laws of nature’ that we imagine underpins that naturalism…

[Harrison traces the history of the often contentious, but ultimately momentous rise of naturalism, then considers the historical accounts of that ascension– and what they gloss over or miss altogether. He then turns to why that matters…]

… the contrived histories of naturalism that purport to show its victory over supernaturalism were fabricated in the 19th century and are simply not consistent with the historical evidence. They are also tainted by a cultural condescension that, in the past at least, descended into outright racism. Few, if any, would today endorse the chauvinism that attends these older, triumphalist accounts of the history of naturalism. Yet, it is worth reflecting upon the extent to which elements of cultural condescension necessarily colour scholarly endeavours that are premised on the imagined ‘neutral’ grounds of naturalism. Careful consideration of the contingent historical circumstances that gave rise to present analytic categories that enjoy significant standing and authority would suggest that there is nothing especially neutral or objective about them. Any clear-eyed crosscultural comparison – one that refrains from assessing worldviews in terms of how they measure up to the standard of the modern West – will reinforce this. We might go so far as to adopt a form of ‘reverse anthropology’, where we think how our own conceptions of the world might look if we adopted the frameworks of others. This might entail dispensing with the idea of the supernatural, and attempting to think outside the box of our recently inherited natural/supernatural distinction.

History [that is, the “actual” history that Harrison recounts] suggests that our regnant modern naturalism is deeply indebted to monotheism, and that its adherents may need to abandon the comforting idea that their naturalistic commitments are licensed by the success of science. As for the idea of the supernatural, ironically this turns out to be far more important for the identity of those who wish to deny its reality than it had ever been for traditional religious believers…

Fascinating and provocative: “The birth of naturalism,” from @uqpharri in @aeonmag.

* “There is only one world, the natural world, exhibiting patterns we call the ‘laws of nature’, and which is discoverable by the methods of science and empirical investigation. There is no separate realm of the supernatural, spiritual, or divine; nor is there any cosmic teleology or transcendent purpose inherent in the nature of the universe or in human life.” – Sean Carroll, The Big Picture

###

As we rethink reality, we might recall that it was on this date in 1588 that Tycho Brahe first outlined his “Tychonic system” concept of the structure of the solar system. The Tychonic system was a hybrid, sharing both the basic idea of the geocentric system of Ptolemy, and the heliocentric idea of Nicholas Copernicus. Published in his De mundi aetherei recentioribus phaenomenis, Tycho’s proposal, retaining Aristotelian physics, kept the Sun and Moon revolving about the Earth at the center of the universe, with the shell of the fixed stars, at a great distance, also centered on the Earth. But like Copernicus, he agreed that Mercury, Venus, Mars, Jupiter, and Saturn revolved about the Sun. Thus he could explain the motions of the heavens without “crystal spheres” carrying the planets through complex Ptolemaic epicycles.

A 17th century illustration of the Hypothesis Tychonica (source)

On this same date, in 1633, Galileo Galilei arrived in Rome to face trial before the Inquisition. His crime was professing the belief that the earth revolves around the sun– based on observations that he’d made further to Copernicus and Tycho.

Cristiano Banti‘s 1857 painting Galileo facing the Roman Inquisition (source)

“The pursuit of science is a grand adventure, driven by curiosity, fueled by passion, and guided by reason”*…

Adam Mastroianni on how science advances (and how it’s held back), with a provocative set of suggestions for how it might be accelerated…

There are two kinds of problems in the world: strong-link problems and weak-link problems.

Weak-link problems are problems where the overall quality depends on how good the worst stuff is. You fix weak-link problems by making the weakest links stronger, or by eliminating them entirely.

Food safety, for example, is a weak-link problem. You don’t want to eat anything that will kill you. That’s why it makes sense for the Food and Drug Administration to inspect processing plants, to set standards, and to ban dangerous foods…

Weak-link problems are everywhere. A car engine is a weak-link problem: it doesn’t matter how great your spark plugs are if your transmission is busted. Nuclear proliferation is a weak-link problem: it would be great if, say, France locked up their nukes even tighter, but the real danger is some rogue nation blowing up the world. Putting on too-tight pants is a weak-link problem: they’re gonna split at the seams.

It’s easy to assume that all problems are like this, but they’re not. Some problems are strong-link problems: overall quality depends on how good the best stuff is, and the bad stuff barely matters. Like music, for instance. You listen to the stuff you like the most and ignore the rest. When your favorite band releases a new album, you go “yippee!” When a band you’ve never heard of and wouldn’t like anyway releases a new album, you go…nothing at all, you don’t even know it’s happened. At worst, bad music makes it a little harder for you to find good music, or it annoys you by being played on the radio in the grocery store while you’re trying to buy your beetle-free asparagus…

Strong-link problems are everywhere; they’re just harder to spot. Winning the Olympics is a strong-link problem: all that matters is how good your country’s best athletes are. Friendships are a strong-link problem: you wouldn’t trade your ride-or-dies for better acquaintances. Venture capital is a strong-link problem: it’s fine to invest in a bunch of startups that go bust as long as one of them goes to a billion…

In the long run, the best stuff is basically all that matters, and the bad stuff doesn’t matter at all. The history of science is littered with the skulls of dead theories. No more phlogiston nor phlegm, no more luminiferous ether, no more geocentrism, no more measuring someone’s character by the bumps on their head, no more barnacles magically turning into geese, no more invisible rays shooting out of people’s eyes, no more plum pudding

Our current scientific beliefs are not a random mix of the dumbest and smartest ideas from all of human history, and that’s because the smarter ideas stuck around while the dumber ones kind of went nowhere, on average—the hallmark of a strong-link problem. That doesn’t mean better ideas win immediately. Worse ideas can soak up resources and waste our time, and frauds can mislead us temporarily. It can take longer than a human lifetime to figure out which ideas are better, and sometimes progress only happens when old scientists die. But when a theory does a better job of explaining the world, it tends to stick around.

(Science being a strong-link problem doesn’t mean that science is currently strong. I think we’re still living in the Dark Ages, just less dark than before.)

Here’s the crazy thing: most people treat science like it’s a weak-link problem.

Peer reviewing publications and grant proposals, for example, is a massive weak-link intervention. We spend ~15,000 collective years of effort every year trying to prevent bad research from being published. We force scientists to spend huge chunks of time filling out grant applications—most of which will be unsuccessful—because we want to make sure we aren’t wasting our money…

I think there are two reasons why scientists act like science is a weak-link problem.

The first reason is fear. Competition for academic jobs, grants, and space in prestigious journals is more cutthroat than ever. When a single member of a grant panel, hiring committee, or editorial board can tank your career, you better stick to low-risk ideas. That’s fine when we’re trying to keep beetles out of asparagus, but it’s not fine when we’re trying to discover fundamental truths about the world…

The second reason is status. I’ve talked to a lot of folks since I published The rise and fall of peer review and got a lot of comments, and I’ve realized that when scientists tell me, “We need to prevent bad research from being published!” they often mean, “We need to prevent people from gaining academic status that they don’t deserve!” That is, to them, the problem with bad research isn’t really that it distorts the scientific record. The problem with bad research is that it’s cheating

I get that. It’s maddening to watch someone get ahead using shady tactics, and it might seem like the solution is to tighten the rules so we catch more of the cheaters. But that’s weak-link thinking. The real solution is to care less about the hierarchy

Here’s our reward for a generation of weak-link thinking.

The US government spends ~10x more on science today than it did in 1956, adjusted for inflation. We’ve got loads more scientists, and they publish way more papers. And yet science is less disruptive than ever, scientific productivity has been falling for decades, and scientists rate the discoveries of decades ago as worthier than the discoveries of today. (Reminder, if you want to blame this on ideas getting harder to find, I will fight you.)…

Whether we realize it or not, we’re always making calls like this. Whenever we demand certificates, credentials, inspections, professionalism, standards, and regulations, we are saying: “this is a weak-link problem; we must prevent the bad!”

Whenever we demand laissez-faire, the cutting of red tape, the letting of a thousand flowers bloom, we are saying: “this is a strong-link problem; we must promote the good!”

When we get this right, we fill the world with good things and rid the world of bad things. When we don’t, we end up stunting science for a generation. Or we end up eating a lot of asparagus beetles…

“Science is a strong-link problem,” from @a_m_mastroianni in @science_seeds.

* James Clerk Maxwell

###

As we ponder the process of progress, we might spare a thought for Sir Christopher Wren; he died on this date in 1723.  A mathematician and astronomer (who co-founded and later served as president of the Royal Society), he is better remembered as one of the most highly acclaimed English architects in history; he was given responsibility for rebuilding 52 churches in the City of London after the Great Fire in 1666, including what is regarded as his masterpiece, St. Paul’s Cathedral, on Ludgate Hill.

Wren, whose scientific work ranged broadly– e.g., he invented a “weather clock” similar to a modern barometer and new engraving methods, and helped develop a blood transfusion technique– was admired by Isaac Newton, as Newton noted in the Principia.

 source

“Romanticism is precisely situated neither in choice of subject, nor exact truth, but in the way of feeling”*…

Beethoven at 30 (1800)

The estimable Ted Gioia is exploring the possibility that we are at the cusp of a major change in the zeitgeist– the beginning of a new age of Romanticism…

I made a flippant remark a few months ago. It was almost a joke.

But then I started taking it seriously.

I said that technocracy had grown so oppressive and manipulative it would spur a backlash. And that our rebellion might resemble the Romanticist movement of the early 1800s.

We need a new Romanticism, I quipped. And we will probably get one.

A new Romanticism? Could that really happen? That seems so unlikely.

Even I didn’t take this seriously (at first). I was just joking. But during the subsequent weeks and months, I kept thinking about my half-serious claim.

I realized that, the more I looked at what happened circa 1800, the more it reminded me of our current malaise.

  • Rationalist and algorithmic models were dominating every sphere of life at that midpoint in the Industrial Revolution—and people started resisting the forces of progress.
  • Companies grew more powerful, promising productivity and prosperity. But Blake called them “dark Satanic mills” and Luddites started burning down factories—a drastic and futile step, almost the equivalent of throwing away your smartphone.
  • Even as science and technology produced amazing results, dysfunctional behaviors sprang up everywhere. The pathbreaking literary works from the late 1700s reveal the dark side of the pervasive techno-optimism—Goethe’s novel about Werther’s suicide, the Marquis de Sade’s nasty stories, and all those gloomy Gothic novels. What happened to the Enlightenment?
  • As the new century dawned, the creative class (as we would call it today) increasingly attacked rationalist currents that had somehow morphed into violent, intrusive forces in their lives—a 180-degree shift in the culture. For Blake and others, the name Newton became a term of abuse.
  • Artists, especially poets and musicians, took the lead in this revolt. They celebrated human feeling and emotional attachments—embracing them as more trustworthy, more flexible, more desirable than technology, profits, and cold calculation.

That’s the world, circa 1800.

The new paradigm shocked Europe when it started to spread. Cultural elites had just assumed that science and reason would control everything in the future. But that wasn’t how it played out.

Resemblances with the current moment are not hard to see.

These considerations led me, about nine months ago, to conduct a deep dive into the history of the Romanticist movement. I wanted to see what the historical evidence told me.

I’m now structuring my research in chronological order—that’s a method I often use in addressing big topics.

I make no great promises for what I share below. These are just notes on what happened in Western culture from 1800 to 1804—listed year-by-year.

Sharing these is part of my process. I expect this will generate useful feedback, and guide me on the next phase of this project…

Because music is always my entry point into cultural changes, it plays a key role here in how I analyze past (and present) events. I firmly believe that music is an early indicator of social change. The notes below are offered as evidence in support of that view…

[There follows a fascinating– and compelling– account of those five years, featuring Napoleon, Haydn, Beethoven, Wordsworth, Coleridge, Herder, Schelling, the Marquis de Sade, Novalis, Ann Radcliffe, and others]

… Beethoven turns against Napoleon—and this is emblematic of the aesthetic reversal sweeping through Europe. Not long ago, Beethoven and other artists looked to French rationalism as a harbinger of a new age of freedom and individual flourishing. But this entire progress-obsessed ideology is unraveling.

It’s somehow fitting that music takes the lead role in deconstructing a tyrannical rationalism, and proposing a more human alternative.

Could that happen again?

  • Imagine a growing sense that algorithmic and mechanistic thinking has become too oppressive.
  • Imagine if people started resisting technology as a malicious form of control, and not a pathway to liberation, empowerment, and human flourishing—soul-nurturing riches that must come from someplace deeper.
  • Imagine a revolt against STEM’s dominance and dictatorship over all other fields?
  • Imagine people deciding that the good life starts with NOT learning how to code.

If that happened now, wouldn’t music stand out as the pathway? What could possibly be more opposed to brutal rationalism running out of control than a song?

But what does that kind of music sound like? In 1800, it was Beethoven. And today?…

Why it may be 1800 all over again: “Notes Toward a New Romanticism,” from @tedgioia in his terrific newsletter, The Honest Broker.

* Charles Baudelaire

###

As we review vibes on the verge, we might send rational birthday greetings to an avatar of the Enlightenment against which the Romantics rebelled, François-Marie Arouet, better known as Voltaire; he was born on this date in 1694. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

source

Written by (Roughly) Daily

November 21, 2023 at 1:00 am