(Roughly) Daily


“There is nothing waste, nothing sterile, nothing dead in the universe; no chaos, no confusions, save in appearance”*…

Still, appearances mattered to Leibniz. And as Richard Halpern explains in a piece adapted from his new book, Leibnizing: A Philosopher in Motion, they give us another avenue to understanding his philosophy…

Possessed of a monumentally impressive intellect, the philosopher and mathematician Gottfried Wilhelm Leibniz (1646–1716) was not blessed with a body to match. Bald, short, and unhandsome of feature, he accordingly availed himself of that universal male cosmetic—and prosthetic—of his era, the peruke (figure 1). Leibniz’s peruke ameliorated several bodily shortcomings: it covered his bald pate, including a bony growth the size of a pigeon’s egg that purportedly sat there; it added several inches to his height; and it did not so much frame his face as distract attention from it.

Leibniz was hardly the only seventeenth-century philosopher to sport a wig: René Descartes and John Locke did so as well. Theirs were not quite so extravagant and luxurious as Leibniz’s, however, nor did they give quite the same impression that a poodle had curled up for a nap on the wearer’s head. In the portrait reproduced here, by the fashionable court painter Christoph Bernhard Francke, Leibniz’s peruke complements the rich velvet folds of his garment to project an aura of prosperity, prestige, and fashion. Leibniz, who was fond of perfume as well as of perukes, made no bones about his wish to be included in polite society. The duke of Orleans was sufficiently impressed with his elegance to declare: “It is unusual for intellectuals to dress well, not to smell bad, and to understand jokes.”

Leibniz’s peruke silently poses questions: Should philosophers concern themselves with reputation, physical appearance, and fashion in the way that Leibniz does? Shouldn’t the philosopher focus rather on the disinterested pursuit of truth? Ever since Diogenes the Cynic, poverty and simplicity have served as emblems of philosophical authenticity. If we no longer demand that our philosophers be poor, we expect at least a certain slovenliness—a sign that their attention is directed elsewhere, upon more fundamental matters, and not on their appearance.

John Locke seems to make a related point in the dedicatory epistle to An Essay Concerning Human Understanding: “The Imposition of Novelty is a terrible Charge among those, who judge of Men’s Heads as they do their Perukes, by the Fashion; and can allow none to be right, but the received Doctrines.” The philosopher is supposed to be defined by what goes on in his or her head, not by what is perched upon it. Philosophers pursue truth, but the wig is an emblem of falsehood. The philosopher investigates eternal verities, but the wig occupies the ephemeral realm of fashion. In The Wig: A Harebrained History (London: Reaktion, 2020), Luigi Amara posits the wig as the supremely antiphilosophical object, more at home with the deceptive rhetorical chicanery of the Sophists (and for that reason also a supreme philosophical provocation).

But if philosophy and wigs are conceptually incompatible, this fact did not seem to bother Leibniz, who was perfectly comfortable with both. I would like to suggest, indeed, that the wig takes on enhanced significance if juxtaposed not only to philosophy in general but also to Leibniz’s philosophy in particular. One of the things Leibnizian metaphysics does is take Cartesian dualism and push it to an extreme: bodies and minds are not only of essentially different natures, as Descartes held; precisely because of this, they do not interact at all. But if bodies and minds cannot affect one another causally, Leibniz argued, they nevertheless express each other. Every mental event is accompanied by some change in the bodily state of the entity experiencing it, and vice versa. These expressive relations are not the result of direct mutual influence but are created by God as part of what Leibniz called pre-established harmony. In place of causal relations between mind and body, then, Leibniz posits something more like aesthetic ones.

Leibniz’s philosophy would claim, therefore, that his own bodily appearance is not unrelated to what goes on in his head…

On the philosophical importance of fake hair: “Leibniz’s Peruke,” from @ColumbiaUP.

* Gottfried Wilhelm Leibniz

###

As we ruminate on rugs, we might send insightful birthday greetings to Robert Pirsig; he was born on this date in 1928. A writer and philosopher, he is best known for Zen and the Art of Motorcycle Maintenance: An Inquiry into Values, an exploration of the underlying metaphysics of Western culture.

Pirsig had great difficulty finding a publisher for Zen and the Art of Motorcycle Maintenance. He pitched the idea for his book to 121 different publishers, sending them a cover letter along with two sample pages; only 22 responded favorably, and then only tentatively. Ultimately, an editor at William Morrow accepted the finished manuscript; when he did, his publisher’s internal recommendation averred, “This book is brilliant beyond belief, it is probably a work of genius, and will, I’ll wager, attain classic stature.” Indeed, in his review, George Steiner compared Pirsig’s writing to Dostoevsky, Broch, Proust, and Bergson, arguing that “the assertion itself is valid … the analogies with Moby-Dick are patent.”

source

“Life is a Zen koan, that is, an unsolvable riddle. But the contemplation of that riddle – even though it cannot be solved – is, in itself, transformative.”*…

How hard is it to prove that problems are hard to solve? Meta-complexity theorists have been asking questions like this for decades. And as Ben Brubaker explains, a string of recent results has started to deliver answers…

… Even seasoned researchers find understanding in short supply when they confront the central open question in theoretical computer science, known as the P versus NP problem. In essence, that question asks whether many computational problems long considered extremely difficult can actually be solved easily (via a secret shortcut we haven’t discovered yet), or whether, as most researchers suspect, they truly are hard. At stake is nothing less than the nature of what’s knowable.
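The asymmetry at the heart of P versus NP is easy to demonstrate, if not to prove. Take subset sum: checking a proposed answer takes moments, while the only general method anyone knows for finding one is, in the worst case, brute force over exponentially many candidates. A minimal Python sketch (our own illustration, not from the article):

```python
from itertools import combinations

def verify_subset_sum(nums, target, certificate):
    """Check a proposed solution -- polynomial time, easy.
    (A sketch: it ignores multiplicity of repeated elements.)"""
    return all(x in nums for x in certificate) and sum(certificate) == target

def find_subset_sum(nums, target):
    """Search for a solution -- brute force over all 2^n subsets."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None  # no subset sums to the target

nums = [3, 34, 4, 12, 5, 2]
solution = find_subset_sum(nums, 9)
print(solution, verify_subset_sum(nums, 9, solution))  # -> [4, 5] True
```

If P = NP, some undiscovered shortcut would make the search as cheap as the check; most researchers suspect no such shortcut exists.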

Despite decades of effort by researchers in the field of computational complexity theory — the study of such questions about the intrinsic difficulty of different problems — a resolution to the P versus NP question has remained elusive. And it’s not even clear where a would-be proof should start.

“There’s no road map,” said Michael Sipser, a veteran complexity theorist at the Massachusetts Institute of Technology who spent years grappling with the problem in the 1980s. “It’s like you’re going into the wilderness.”

It seems that proving that computational problems are hard to solve is itself a hard task. But why is it so hard? And just how hard is it? Marco Carmosino and other researchers in the subfield of meta-complexity reformulate questions like this as computational problems, propelling the field forward by turning the lens of complexity theory back on itself.

“You might think, ‘OK, that’s kind of cool. Maybe the complexity theorists have gone crazy,’” said Rahul Ilango, a graduate student at MIT who has produced some of the most exciting recent results in the field.

By studying these inward-looking questions, researchers have learned that the hardness of proving computational hardness is intimately tied to fundamental questions that may at first seem unrelated. How hard is it to spot hidden patterns in apparently random data? And if truly hard problems do exist, how often are they hard?

“It’s become clear that meta-complexity is close to the heart of things,” said Scott Aaronson, a complexity theorist at the University of Texas, Austin.

This is the story of the long and winding trail that led researchers from the P versus NP problem to meta-complexity. It hasn’t been an easy journey — the path is littered with false turns and roadblocks, and it loops back on itself again and again. Yet for meta-complexity researchers, that journey into an uncharted landscape is its own reward. Start asking seemingly simple questions, said Valentine Kabanets, a complexity theorist at Simon Fraser University in Canada, and “you have no idea where you’re going to go.”…

Complexity theorists are confronting their most puzzling problem yet– complexity theory itself: “Complexity Theory’s 50-Year Journey to the Limits of Knowledge,” from @benbenbrubaker in @QuantaMagazine.

* Tom Robbins

###

As we limn limits, we might send thoroughly cooked birthday greetings to Denis Papin; he was born on this date in 1647. A mathematician and physicist who worked with Christiaan Huygens and Gottfried Leibniz, Papin is better remembered as the inventor of the steam digester, the forerunner of the pressure cooker and of the steam engine.

source

“Machines take me by surprise with great frequency”*…

In search of universals in the 17th century, Gottfried Leibniz imagined the calculus ratiocinator, a theoretical framework for universal logical calculation– one that led Norbert Wiener to suggest that Leibniz should be considered the patron saint of cybernetics. In the 19th century, Charles Babbage and Ada Lovelace took a pair of whacks at making it real.

Ironically, it was confronting the impossibility of a universal calculator that led to modern computing. In 1936 (the same year that Charlie Chaplin released Modern Times), Alan Turing (following on Gödel’s demonstration that mathematics is incomplete, and addressing Hilbert’s “decision problem” querying the limits of computation) published the (notional) design of a “machine” that elegantly demonstrated those limits– and, as Sheon Han explains, birthed computing as we know it…

… [Hilbert’s] question would lead to a formal definition of computability, one that allowed mathematicians to answer a host of new problems and laid the foundation for theoretical computer science.

The definition came from a 23-year-old grad student named Alan Turing, who in 1936 wrote a seminal paper that not only formalized the concept of computation, but also proved a fundamental question in mathematics and created the intellectual foundation for the invention of the electronic computer. Turing’s great insight was to provide a concrete answer to the computation question in the form of an abstract machine, later named the Turing machine by his doctoral adviser, Alonzo Church. It’s abstract because it doesn’t (and can’t) physically exist as a tangible device. Instead, it’s a conceptual model of computation: If the machine can calculate a function, then the function is computable.
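The machine itself is simple enough to sketch in a few lines. What follows is our own toy rendering (the rule format and state names are invented for illustration, not Turing’s original formalism): a table of rules, a tape, and a head that reads, writes, and moves.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Step a Turing machine until no rule applies (i.e., it halts).

    rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left) or +1 (right).
    """
    tape = dict(enumerate(tape))  # sparse tape, extendable in both directions
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in rules:  # no applicable rule: halt
            break
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape))

# A machine that flips every bit, then halts at the first blank cell.
flipper = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_turing_machine(flipper, "1011"))  # -> "0100"
```

Note that the Python function itself behaves like a universal machine in miniature: it reads a *description* of some other machine and simulates it, which is precisely the idea behind the universal Turing machine discussed below.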

With his abstract machine, Turing established a model of computation to answer the Entscheidungsproblem, which formally asks: Given a set of mathematical axioms, is there a mechanical process — a set of instructions, which today we’d call an algorithm — that can always determine whether a given statement is true?…

… in 1936, Church and Turing — using different methods — independently proved that there is no general way of solving every instance of the Entscheidungsproblem. For example, some games, such as John Conway’s Game of Life, are undecidable: No algorithm can determine whether a certain pattern will appear from an initial pattern.
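The Game of Life example is easy to appreciate once one sees what simulation can and cannot do: we can march any pattern forward one generation at a time, but no amount of stepping settles, in general, whether a given pattern will *ever* appear. A minimal sketch of one generation (our own illustration):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life on an unbounded grid.

    live is a set of (x, y) coordinates of live cells. A cell is alive
    next generation iff it has exactly 3 live neighbours, or it is
    currently alive and has exactly 2.
    """
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 0), (1, 0), (2, 0)}
print(sorted(life_step(blinker)))  # -> [(1, -1), (1, 0), (1, 1)]
```

Undecidability means there is no algorithm that, given an arbitrary starting configuration and target pattern, always answers the “will it appear?” question; stepping forward, as above, is all we can do.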

Beyond answering these fundamental questions, Turing’s machine also led directly to the development of modern computers, through a variant known as the universal Turing machine. This is a special kind of Turing machine that can simulate any other Turing machine on any input. It can read a description of other Turing machines (their rules and input tapes) and simulate their behaviors on its own input tape, producing the same output that the simulated machine would produce, just as today’s computers can read any program and execute it. In 1945, John von Neumann proposed a computer architecture — called the von Neumann architecture — that made the universal Turing machine concept possible in a real-life machine…

As Turing said, “if a machine is expected to be infallible, it cannot also be intelligent.” On the importance of thought experiments: “The Most Important Machine That Was Never Built,” from @sheonhan in @QuantaMagazine.

* Alan Turing

###

As we sum it up, we might spare a thought for Martin Gardner; he died on this date in 2010.  Though not an academic, nor ever a formal student of math or science, he wrote widely and prolifically on both subjects in such popular books as The Ambidextrous Universe and The Relativity Explosion and as the “Mathematical Games” columnist for Scientific American. Indeed, his elegant– and understandable– puzzles delighted professional and amateur readers alike, and helped inspire a generation of young mathematicians.

Gardner’s interests were wide; in addition to the math and science that were his power alley, he studied and wrote on topics that included magic, philosophy, religion, and literature (cf. especially his work on Lewis Carroll– including the delightful Annotated Alice– and on G.K. Chesterton). And he was a fierce debunker of pseudoscience: a founding member of CSICOP and contributor of a monthly column (“Notes of a Fringe Watcher,” from 1983 to 2002) in Skeptical Inquirer, that organization’s magazine.

 source

Written by (Roughly) Daily

May 22, 2023 at 1:00 am

“Speed and acceleration are merely the dream of making time reversible”*…

In the early 20th century, there was Futurism…

The Italian Futurists, from the first half of the twentieth century… wanted to drive modernisation in turn-of-the-century Italy at a much faster pace. They saw the potential in machines, and technology, to transform the country, to demand progress. It was not however merely an incrementalist approach they were after: words like annihilation, destruction and apocalypse appear in the writings of the futurists, including the author of The Futurist Manifesto, Filippo Tommaso Marinetti. ‘We want to glorify war – the only cure for the world…’ Marinetti proclaimed – this was not for the faint hearted! That same Marinetti was the founder of the Partito Politico Futuristo in 1918, which became part of Mussolini’s Fascist party in 1919. Things did not go well after that.

Beautiful Ideas Which Kill: Accelerationism, Futurism and Bewilderment

And now, in the early 21st century, there is Accelerationism…

These [politically-motivated mass] killings were often linked to the alt-right, described as an outgrowth of the movement’s rise in the Trump era. But many of these suspected killers, from Atomwaffen thugs to the New Zealand mosque shooter to the Poway synagogue attacker, are more tightly connected to a newer and more radical white supremacist ideology, one that dismisses the alt-right as cowards unwilling to take matters into their own hands.

It’s called “accelerationism,” and it rests on the idea that Western governments are irreparably corrupt. As a result, the best thing white supremacists can do is accelerate their demise by sowing chaos and creating political tension. Accelerationist ideas have been cited in mass shooters’ manifestos — explicitly, in the case of the New Zealand killer — and are frequently referenced in white supremacist web forums and chat rooms.

Accelerationists reject any effort to seize political power through the ballot box, dismissing the alt-right’s attempts to engage in mass politics as pointless. If one votes, one should vote for the most extreme candidate, left or right, to intensify points of political and social conflict within Western societies. Their preferred tactic for heightening these contradictions, however, is not voting, but violence — attacking racial minorities and Jews as a way of bringing us closer to a race war, and using firearms to spark divisive fights over gun control. The ultimate goal is to collapse the government itself; they hope for a white-dominated future after that…

“Accelerationism: the obscure idea inspiring white supremacist killers around the world” (and source of the image above)

See also: “A Year After January 6, Is Accelerationism the New Terrorist Threat?”

For a look at the “intellectual” roots of accelerationism, see “Accelerationism: how a fringe philosophy predicted the future we live in.”

For a powerful articulation of the dangers of Futurism (and even more, Accelerationism), see “The Perils of Smashing the Past.”

And for a reminder of the not-so-obvious ways that movements like these live on, see “The Intentionally Scandalous 1932 Cookbook That Stands the Test of Time,” on The Futurist Cookbook, by Futurist Manifesto author Filippo Tommaso Marinetti… which foreshadowed the “food as fuel” culinary movements that we see today.

* Jean Baudrillard

###

As we slow down, we might send a “Alles Gute zum Geburtstag” to the polymathic Gottfried Wilhelm Leibniz, the philosopher, mathematician, and political adviser, who was important both as a metaphysician and as a logician, but who is probably best remembered for his independent invention of the calculus; he was born on this date in 1646.  Leibniz discovered and developed differential and integral calculus on his own, which he published in 1684; but he became involved in a bitter priority dispute with Isaac Newton, whose ideas on the calculus were developed earlier (1665), but published later (1687).
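One lasting outcome of that priority dispute: it is Leibniz’s notation, not Newton’s fluxional dots, that survives in every calculus classroom. By way of a small (modern-dress) illustration, the differential and integral symbols he introduced:

```latex
% Leibniz's d and integral sign, as still written today:
% the product rule in differential form, and the integral he named.
d(uv) = u\,dv + v\,du
\qquad\qquad
\int y \, dx
```

Newton’s $\dot{x}$ notation lingers mainly in physics; for everything else, Leibniz’s symbolism won.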

As it happens, Leibniz was a wry and incisive political and cultural observer. Consider, e.g…

If geometry conflicted with our passions and our present concerns as much as morality does, we would dispute it and transgress it almost as much–in spite of all Euclid’s and Archimedes’ demonstrations, which would be treated as fantasies and deemed to be full of fallacies. [Leibniz, New Essays, p. 95]


 source

“That which hath made them drunk hath made me bold”*…

From Jesse Gaynor in The Paris Review, “Drunk Texts from Famous Authors.”

[TotH to EWW]

* Macbeth, Act 2, Scene 2

***

As we get dressed for the party celebrating the births on this date of both Gottfried Wilhelm Leibniz (1646) and Alan Turing (1912), we might spare a moment to wish a Feliz Cumpleaños to Joseph Hill “Joss” Whedon; he was born on this date in 1964. An award-winning writer, producer, director, composer, and comic creator, Whedon is probably best known as the director of this summer’s smash, The Avengers, and as the creator-writer-director of such critically-acclaimed television series as Firefly, Buffy the Vampire Slayer, and Dollhouse. It’s less well remembered that he was nominated for an Academy Award for his screenplay for Toy Story, the film that launched Pixar’s feature animation juggernaut.

 source

Written by (Roughly) Daily

June 23, 2012 at 1:01 am