“We must not forget that the wheel is reinvented so often because it is a very good idea”*…
… but when was it first discovered? And, given its obvious and ubiquitous utility, why there (and not somewhere else)? Kai James offers an answer…
Imagine you’re a copper miner in southeastern Europe in the year 3900 B.C.E. Day after day you haul copper ore through the mine’s sweltering tunnels.
You’ve resigned yourself to the grueling monotony of mining life. Then one afternoon, you witness a fellow worker doing something remarkable.
With an odd-looking contraption, he casually transports the equivalent of three times his body weight on a single trip. As he returns to the mine to fetch another load, it suddenly dawns on you that your chosen profession is about to get far less taxing and much more lucrative.
What you don’t realize: You’re witnessing something that will change the course of history – not just for your tiny mining community, but for all of humanity.
Despite the wheel’s immeasurable impact, no one is certain as to who invented it, or when and where it was first conceived. The hypothetical scenario described above is based on a 2015 theory that miners in the Carpathian Mountains – in present-day Hungary – first invented the wheel nearly 6,000 years ago as a means to transport copper ore.
The theory is supported by the discovery of more than 150 miniaturized wagons by archaeologists working in the region. These pint-sized, four-wheeled models were made from clay, and their outer surfaces were engraved with a wickerwork pattern reminiscent of the basketry used by mining communities at the time. Carbon dating later revealed that these wagons are the earliest known depictions of wheeled transport to date.
This theory also raises a question of particular interest to me, an aerospace engineer who studies the science of engineering design. How did an obscure, scientifically naive mining society discover the wheel, when highly advanced civilizations, such as the ancient Egyptians, did not?…
Read on to find out: “How was the wheel invented? Computer simulations reveal the unlikely birth of a world-changing technology nearly 6,000 years ago,” from @us.theconversation.com.
* “We must not forget that the wheel is reinvented so often because it is a very good idea; I’ve learned to worry more about the soundness of ideas that were invented only once.” – David Parnas
###
As we roll along, we might send an “Alles Gute zum Geburtstag” to the man at the center of the question of the invention of another foundational “technology”: the polymathic Gottfried Wilhelm Leibniz, the philosopher, mathematician, inventor (of, among other things, an early calculator), and political adviser.
Leibniz was important both as a metaphysician and as a logician, but he is probably best remembered for his independent invention of the calculus; he was born on this date in 1646. Leibniz independently discovered and developed differential and integral calculus, which he published in 1684; but he became involved in a bitter priority dispute with Isaac Newton, whose ideas on the calculus were developed earlier (1665), but published later (1687). Scholars largely agree that, in fact, Leibniz and Newton independently developed “the greatest advance in mathematics that had taken place since the time of Archimedes.”

“Great minds think alike”*…

Brian Potter on the (perhaps surprising) frequency with which “heroic” inventors are in fact better understood as the winners of close races…
When Alexander Graham Bell filed a patent for the telephone on February 14th, 1876, he beat competing telephone developer Elisha Gray to the patent office by just a few hours. The resulting legal dispute between Bell Telephone and Western Union (which owned the rights to Gray’s invention) would consume millions of dollars before being resolved in Bell’s favor in 1879.
Such cases of multiple invention are common, and some of the most famous and important modern inventions were invented in parallel. Both Thomas Edison and Joseph Swan patented incandescent lightbulbs in 1880. Jack Kilby and Robert Noyce patented integrated circuits in 1959. Hans von Ohain and Frank Whittle independently invented the jet engine in the 1930s. In a 1922 paper, William Ogburn and Dorothy Thomas documented 150 cases of multiple discovery in science and technology. Robert Merton found 261 examples in 1961, and observed that the phenomenon of multiple discovery was itself a multiple discovery, having been described over and over again since at least the early 19th century.
But exactly how common is multiple invention? The frequency of examples suggests that it can’t be particularly rare, but that doesn’t tell us the rate at which it occurs. In “How Common is Independent Discovery?,” Matt Clancy catalogues several attempts to estimate the frequency of multiple discovery, and tentatively comes up with a frequency of around 2-3% for simultaneous scientific discoveries, and perhaps an 8% chance that a given invention will be reinvented in the next decade. But the evidence for inventions is somewhat inconsistent and varies greatly between studies: while Clancy estimates a reinvention rate of around 8% per decade, another study he cites, which looked at patent interference lawsuits between 1998 and 2014, suggests an independent invention rate of only around 0.02% per year.
The frequency of multiple invention is a useful thing to know, because it can give us clues about the nature of technological progress. A very low rate of multiple invention suggests that progress might be driven by a small number of “genius” inventors (what we might call the Great Man Theory of technological progress), and that it might be highly historically contingent (if you re-rolled the dice of history, maybe you get a totally new set of inventions and a different technological palette). A high rate of multiple invention suggests that progress is more a function of broad historical forces (that inventions appear when the conditions are right), and that progress is less contingent (if you re-rolled the dice of history, you’d get a similar progression of inventions). And if the rate of multiple invention is changing over time, perhaps the nature of technological progress is changing as well…
[Potter reviews the history and concludes that “multiple invention was extremely common”…]
… My main takeaway is that the ideas behind inventions are often in some sense “obvious,” or at least not so surprising or unexpected that many people won’t think of them. In some cases, this is probably because once some new possibility comes along, lots of people think of similar things that could be done with it. Once the properties of electricity began to be understood, many people came up with the idea of using it to send signals (telephone, telegraph), or to create motion (engines and generators), or to generate light (arc lamps, incandescent lights). Once the steam engine came along, lots of people had the idea to use it to power various types of vehicles.
In other cases, multiple invention probably occurs because important problems will attract many people trying to solve them. Steel corrosion was a large problem inspiring many folks to look for ways to create a steel that didn’t rust, or notice the potential value if they stumbled across such a material. Lamps causing mine fires were a major problem, inspiring many people to come up with ideas for safety lamps. The smoke produced by gunpowder was a major problem, inspiring many efforts to develop smokeless powders. And because would-be inventors will all draw from the same pool of available technologies, materials, and capabilities when coming up with a solution, there will be a large degree of convergence in the solutions they come up with…
Fascinating: “How Common is Multiple Invention?” from @const-physics.blogsky.venki.dev.
* common idiom
###
As we reconsider credit, we might recall that it was on this date in 1661 that Isaac Newton— a key figure in the Scientific Revolution and the Enlightenment that followed– entered Trinity College, Cambridge. Soon after Newton obtained his BA degree at Cambridge in August 1665, the university temporarily closed as a precaution against the Great Plague. Although he had been undistinguished as a Cambridge student, his private studies and the years following his bachelor’s degree have been described as “the richest and most productive ever experienced by a scientist.”
Relevantly to the piece above, Newton was party to a dispute with Gottfried Wilhelm Leibniz (who started, at age 14, at the University of Leipzig the same year that Newton matriculated at Cambridge) over which of them developed calculus– called “the greatest advance in mathematics that had taken place since the time of Archimedes.” The modern consensus is that the two men independently developed their ideas.

“There is nothing waste, nothing sterile, nothing dead in the universe; no chaos, no confusions, save in appearance”*…
Still, appearances mattered to Leibniz. And as Richard Halpern explains in a piece adapted from his new book, Leibnizing: A Philosopher in Motion, they give us another avenue to understanding his philosophy…
Possessed of a monumentally impressive intellect, the philosopher and mathematician Gottfried Wilhelm Leibniz (1646–1716) was not blessed with a body to match. Bald, short, and unhandsome of feature, he accordingly availed himself of that universal male cosmetic—and prosthetic—of his era, the peruke (figure 1). Leibniz’s peruke ameliorated several bodily shortcomings: it covered his bald pate, including a bony growth the size of a pigeon’s egg that purportedly sat there; it added several inches to his height; and it did not so much frame his face as distract attention from it.
Leibniz was hardly the only seventeenth-century philosopher to sport a wig: René Descartes and John Locke did so as well. Theirs were not quite so extravagant and luxurious as Leibniz’s, however, nor did they give quite the same impression that a poodle had curled up for a nap on the wearer’s head. In the portrait reproduced here, by the fashionable court painter Christoph Bernhard Francke, Leibniz’s peruke complements the rich velvet folds of his garment to project an aura of prosperity, prestige, and fashion. Leibniz, who was fond of perfume as well as of perukes, made no bones about his wish to be included in polite society. The duke of Orleans was sufficiently impressed with his elegance to declare: “It is unusual for intellectuals to dress well, not to smell bad, and to understand jokes.”
Leibniz’s peruke silently poses questions: Should philosophers concern themselves with reputation, physical appearance, and fashion in the way that Leibniz does? Shouldn’t the philosopher focus rather on the disinterested pursuit of truth? Ever since Diogenes the Cynic, poverty and simplicity have served as emblems of philosophical authenticity. If we no longer demand that our philosophers be poor, we expect at least a certain slovenliness—a sign that their attention is directed elsewhere, upon more fundamental matters, and not on their appearance.
John Locke seems to make a related point in the dedicatory epistle to An Essay Concerning Human Understanding: “The Imposition of Novelty is a terrible Charge among those, who judge of Men’s Heads as they do their Perukes, by the Fashion; and can allow none to be right, but the received Doctrines.” The philosopher is supposed to be defined by what goes on in his or her head, not by what is perched upon it. Philosophers pursue truth, but the wig is an emblem of falsehood. The philosopher investigates eternal verities, but the wig occupies the ephemeral realm of fashion. In The Wig: A Harebrained History (London: Reaktion, 2020), Luigi Amara posits the wig as the supremely antiphilosophical object, more at home with the deceptive rhetorical chicanery of the Sophists (and for that reason also a supreme philosophical provocation).
But if philosophy and wigs are conceptually incompatible, this fact did not seem to bother Leibniz, who was perfectly comfortable with both. I would like to suggest, indeed, that the wig takes on enhanced significance if juxtaposed not only to philosophy in general but also to Leibniz’s philosophy in particular. One of the things Leibnizian metaphysics does is take Cartesian dualism and push it to an extreme: bodies and minds are not only of essentially different natures, as Descartes held; because of this, they do not interact at all. But if bodies and minds cannot affect one another causally, Leibniz argued, they nevertheless express each other. Every mental event is accompanied by some change in the bodily state of the entity experiencing it and vice versa. These expressive relations are not the result of direct mutual influence but are created by God as part of what Leibniz called pre-established harmony. In place of causal relations between mind and body, then, Leibniz posits something more like aesthetic ones.
Leibniz’s philosophy would claim, therefore, that his own bodily appearance is not unrelated to what goes on in his head…
On the philosophical importance of fake hair: “Leibniz’s Peruke,” from @ColumbiaUP.
* Gottfried Wilhelm Leibniz
###
As we ruminate on rugs, we might send insightful birthday greetings to Robert Pirsig; he was born on this date in 1928. A writer and philosopher, he is best known for Zen and the Art of Motorcycle Maintenance: An Inquiry into Values, an exploration of the underlying metaphysics of Western culture.
Pirsig had great difficulty finding a publisher for Zen and the Art of Motorcycle Maintenance. He pitched the idea for his book to 121 different publishers, sending them a cover letter along with two sample pages; only 22 responded favorably, and then only tentatively. Ultimately, an editor at William Morrow accepted the finished manuscript; the publisher’s internal recommendation averred, “This book is brilliant beyond belief, it is probably a work of genius, and will, I’ll wager, attain classic stature.” Indeed, in his review, George Steiner compared Pirsig’s writing to Dostoevsky, Broch, Proust, and Bergson, arguing that “the assertion itself is valid … the analogies with Moby-Dick are patent.”
“Machines take me by surprise with great frequency”*…
In search of universals in the 17th century, Gottfried Leibniz imagined the calculus ratiocinator, a theoretical logical calculation framework aimed at universal application – one that led Norbert Wiener to suggest that Leibniz should be considered the patron saint of cybernetics. In the 19th century, Charles Babbage and Ada Lovelace took a pair of whacks at making it real.
Ironically, it was confronting the impossibility of a universal calculator that led to modern computing. In 1936 (the same year that Charlie Chaplin released Modern Times) Alan Turing (following on Gödel’s demonstration that mathematics is incomplete and addressing Hilbert’s “decision problem,” querying the limits of computation) published the (notional) design of a “machine” that elegantly demonstrated those limits– and, as Sheon Han explains, birthed computing as we know it…
… [Hilbert’s] question would lead to a formal definition of computability, one that allowed mathematicians to answer a host of new problems and laid the foundation for theoretical computer science.
The definition came from a 23-year-old grad student named Alan Turing, who in 1936 wrote a seminal paper that not only formalized the concept of computation, but also proved a fundamental question in mathematics and created the intellectual foundation for the invention of the electronic computer. Turing’s great insight was to provide a concrete answer to the computation question in the form of an abstract machine, later named the Turing machine by his doctoral adviser, Alonzo Church. It’s abstract because it doesn’t (and can’t) physically exist as a tangible device. Instead, it’s a conceptual model of computation: If the machine can calculate a function, then the function is computable.
…
With his abstract machine, Turing established a model of computation to answer the Entscheidungsproblem, which formally asks: Given a set of mathematical axioms, is there a mechanical process — a set of instructions, which today we’d call an algorithm — that can always determine whether a given statement is true?…
… in 1936, Church and Turing — using different methods — independently proved that there is no general way of solving every instance of the Entscheidungsproblem. For example, some games, such as John Conway’s Game of Life, are undecidable: No algorithm can determine whether a certain pattern will appear from an initial pattern.
…
Beyond answering these fundamental questions, Turing’s machine also led directly to the development of modern computers, through a variant known as the universal Turing machine. This is a special kind of Turing machine that can simulate any other Turing machine on any input. It can read a description of other Turing machines (their rules and input tapes) and simulate their behaviors on its own input tape, producing the same output that the simulated machine would produce, just as today’s computers can read any program and execute it. In 1945, John von Neumann proposed a computer architecture — called the von Neumann architecture — that made the universal Turing machine concept possible in a real-life machine…
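The “universal” idea sketched above can be illustrated in a few lines of Python: a single simulator function that reads a description of any Turing machine (its transition rules) together with an input tape, and runs it step by step – just as a universal Turing machine reads a description of another machine and simulates it. This is a minimal illustrative sketch; the rule format, function name, and example machine are my own, not drawn from the article.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a Turing machine.

    rules maps (state, symbol) -> (new_state, symbol_to_write, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    The machine halts when it enters the state 'halt'.
    """
    cells = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    else:
        raise RuntimeError("machine did not halt within max_steps")
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine description: flip every bit of a binary string,
# moving right until a blank is reached, then halt.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(flip_rules, "1011"))  # -> 0100
```

Note that the simulator itself embodies Turing’s universality insight: the machine being run is just data (`flip_rules`), so the same function can execute any machine description you hand it – the same way a modern computer can read any program and execute it.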
As Turing said, “if a machine is expected to be infallible, it cannot also be intelligent.” On the importance of thought experiments: “The Most Important Machine That Was Never Built,” from @sheonhan in @QuantaMagazine.
* Alan Turing
###
As we sum it up, we might spare a thought for Martin Gardner; he died on this date in 2010. Though not an academic, nor ever a formal student of math or science, he wrote widely and prolifically on both subjects in such popular books as The Ambidextrous Universe and The Relativity Explosion and as the “Mathematical Games” columnist for Scientific American. Indeed, his elegant– and understandable– puzzles delighted professional and amateur readers alike, and helped inspire a generation of young mathematicians.
Gardner’s interests were wide; in addition to the math and science that were his power alley, he studied and wrote on topics that included magic, philosophy, religion, and literature (cf., especially, his work on Lewis Carroll– including the delightful Annotated Alice— and on G.K. Chesterton). And he was a fierce debunker of pseudoscience: a founding member of CSICOP, and contributor of a monthly column (“Notes of a Fringe Watcher,” from 1983 to 2002) in Skeptical Inquirer, that organization’s monthly magazine.