“In all chaos there is a cosmos, in all disorder a secret order”*…

Between 1617 and 1621 the English physician and polymath Robert Fludd published his masterpiece Utriusque Cosmi . . . Historia, a two-volume work packed with over sixty intricate engravings. Urszula Szulakowska looks at the philosophical and theological ideas behind the extraordinary images found in the first volume, an exploration of the macrocosm of the universe and spiritual realm…
Robert Fludd was a respected English physician (of Welsh origins) employed at the court of King James I of England. He was a prolific writer of vast, multi-volume encyclopaedias in which he discussed a universal range of topics from magical practices — such as alchemy, astrology, kabbalism, and fortune-telling — to radical theological thinking concerning the interrelation of God with the natural and human worlds. However, he also proudly displayed his grasp of practical knowledge, such as mechanics, architecture, military fortifications, armaments, military manoeuvres, hydrology, musical theory and musical instruments, mathematics, geometry, optics, and the art of drawing, as well as chemistry and medicine. Fludd used the common metaphor of the arts as the “ape of Nature”, a microcosmic imitation of how the universe itself functioned.
Fludd’s most famous work is the History of the Two Worlds (Utriusque Cosmi . . . Historia, 1617-21) published in two volumes by Johann Theodore de Bry in Oppenheim. The two worlds under discussion are those of the Microcosm of human life on earth and the Macrocosm of the universe (which included the spiritual realm of the Divine).
Fludd himself was a staunch member of the Anglican Church. He was educated in the medical profession at St. John’s College in Oxford. At the turn of the seventeenth century, he set out for an extended period of travel on the continent. He spent a winter with some Jesuits, a Roman Catholic order deeply opposed to Protestantism who, nevertheless, tutored Fludd on magical practices. Fludd, however, always claimed to have worked out the theological and magical systems in his first volume of the Utriusque Cosmi . . . Historia (1617) during his undergraduate days at Oxford. In this work Fludd devised a lavishly illustrated cosmology based on the chemical theory of Paracelsus, in which the materials of the universe were separated out of chaos by God who acted in the manner of a laboratory alchemist…
All of Fludd’s treatises were lavishly illustrated with extraordinary engravings, unique in their form and subject matter, which have the visionary quality of a genuine spiritual seer and which exerted an influence on his contemporary occultists such as Michael Maier, Jacob Boehme, and Johannes Mylius. Fludd himself designed these images and they were engraved by the artisans employed at his publishers. (Some of his own original drawings still exist for the first volume of the Utriusque Cosmi . . . Historia, 1617)…

Read on for more explanation and many more mesmerizing images. A 17th-century “theory of everything”: “Robert Fludd and His Images of The Divine,” in @PublicDomainRev.
* Carl Jung
###
As we think holistically, we might recall that it was on this date in 1859 that our perspective was shifted in a different kind of way: Charles Darwin published The Origin of Species. Actually, on that day he published On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life; the title was shortened to the one we know with the sixth edition in 1872.
“Alchemy. The link between the immemorial magic arts and modern science. Humankind’s first systematic effort to unlock the secrets of matter by reproducible experiment.”*…
Science has entered a new era of alchemy, suggests Robbert Dijkgraaf, Director of the Institute for Advanced Study in Princeton — and, he argues, that’s a good thing…
Is artificial intelligence the new alchemy? That is, are the powerful algorithms that control so much of our lives — from internet searches to social media feeds — the modern equivalent of turning lead into gold? Moreover: Would that be such a bad thing?
According to the prominent AI researcher Ali Rahimi and others, today’s fashionable neural networks and deep learning techniques are based on a collection of tricks, topped with a good dash of optimism, rather than systematic analysis. Modern engineers, the thinking goes, assemble their codes with the same wishful thinking and misunderstanding that the ancient alchemists had when mixing their magic potions.
It’s true that we have little fundamental understanding of the inner workings of self-learning algorithms, or of the limits of their applications. These new forms of AI are very different from traditional computer codes that can be understood line by line. Instead, they operate within a black box, seemingly unknowable to humans and even to the machines themselves.
This discussion within the AI community has consequences for all the sciences. With deep learning impacting so many branches of current research — from drug discovery to the design of smart materials to the analysis of particle collisions — science itself may be at risk of being swallowed by a conceptual black box. It would be hard to have a computer program teach chemistry or physics classes. By deferring so much to machines, are we discarding the scientific method that has proved so successful, and reverting to the dark practices of alchemy?
Not so fast, says Yann LeCun, co-recipient of the 2018 Turing Award for his pioneering work on neural networks. He argues that the current state of AI research is nothing new in the history of science. It is just a necessary adolescent phase that many fields have experienced, characterized by trial and error, confusion, overconfidence and a lack of overall understanding. We have nothing to fear and much to gain from embracing this approach. It’s simply that we’re more familiar with its opposite.
After all, it’s easy to imagine knowledge flowing downstream, from the source of an abstract idea, through the twists and turns of experimentation, to a broad delta of practical applications. This is the famous “usefulness of useless knowledge,” advanced by Abraham Flexner in his seminal 1939 essay (itself a play on the very American concept of “useful knowledge” that emerged during the Enlightenment).
A canonical illustration of this flow is Albert Einstein’s general theory of relativity. It all began with the fundamental idea that the laws of physics should hold for all observers, independent of their movements. He then translated this concept into the mathematical language of curved space-time and applied it to the force of gravity and the evolution of the cosmos. Without Einstein’s theory, the GPS in our smartphones would drift off course by about 7 miles a day.
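That “7 miles a day” figure is easy to sanity-check. Below is a minimal back-of-the-envelope sketch in Python, assuming standard textbook constants and a circular GPS orbit (the numbers are our assumptions, not figures from the article): the satellite clocks run fast by roughly 38 microseconds per day, and light covers about 7 miles in that interval.

```python
# Back-of-the-envelope check of relativistic clock drift for GPS satellites.
# Constants are standard textbook values; the circular-orbit assumption is ours.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
C = 2.99792458e8       # speed of light, m/s
R_EARTH = 6.371e6      # mean Earth radius, m
R_SAT = 2.6571e7       # GPS orbital radius (~20,200 km altitude), m
DAY = 86400.0          # seconds per day

# General relativity: weaker gravity in orbit makes the satellite clock run fast.
grav_shift = GM * (1 / R_EARTH - 1 / R_SAT) / C**2   # ~5.3e-10

# Special relativity: orbital speed makes the same clock run slow.
v_orbit = (GM / R_SAT) ** 0.5                        # ~3.9 km/s
vel_shift = v_orbit**2 / (2 * C**2)                  # ~0.8e-10

net_drift = (grav_shift - vel_shift) * DAY           # ~38 microseconds/day
range_error = C * net_drift                          # timing error as distance

print(f"net clock drift: {net_drift * 1e6:.1f} microseconds/day")
print(f"position error:  {range_error / 1609.344:.1f} miles/day")  # ~7 miles
```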
But maybe this paradigm of the usefulness of useless knowledge is what the Danish physicist Niels Bohr liked to call a “great truth” — a truth whose opposite is also a great truth. Maybe, as AI is demonstrating, knowledge can also flow uphill.
In the broad history of science, as LeCun suggested, we can spot many examples of this effect, which can perhaps be dubbed “the uselessness of useful knowledge.” An overarching and fundamentally important idea can emerge from a long series of step-by-step improvements and playful experimentation — say, from Fröbel to Nobel.
Perhaps the best illustration is the discovery of the laws of thermodynamics, a cornerstone of all branches of science. These elegant equations, describing the conservation of energy and increase of entropy, are laws of nature, obeyed by all physical phenomena. But these universal concepts only became apparent after a long, confusing period of experimentation, starting with the construction of the first steam engines in the 18th century and the gradual improvement of their design. Out of the thick mist of practical considerations, mathematical laws slowly emerged…
One could even argue that science itself has followed this uphill path. Until the birth of the methods and practices of modern research in the 17th century, scientific research consisted mostly of nonsystematic experimentation and theorizing. Long considered academic dead ends, these ancient practices have been reappraised in recent years: Alchemy is now considered to have been a useful and perhaps even necessary precursor to modern chemistry — more proto-science than hocus-pocus.
The appreciation of tinkering as a fruitful path toward grand theories and insights is particularly relevant for current research that combines advanced engineering and basic science in novel ways. Driven by breakthrough technologies, nanophysicists are tinkering away, building the modern equivalents of steam engines on the molecular level, manipulating individual atoms, electrons and photons. Genetic editing tools such as CRISPR allow us to cut and paste the code of life itself. With structures of unimaginable complexity, we are pushing nature into new corners of reality. With so many opportunities to explore new configurations of matter and information, we could enter a golden age of modern-day alchemy, in the best sense of the word.
However, we should never forget the hard-won cautionary lessons of history. Alchemy was not only a proto-science, but also a “hyper-science” that overpromised and underdelivered. Astrological predictions were taken so seriously that life had to adapt to theory, instead of the other way around. Unfortunately, modern society is not free from such magical thinking, putting too much confidence in omnipotent algorithms, without critically questioning their logical or ethical basis.
Science has always followed a natural rhythm of alternating phases of expansion and concentration. Times of unstructured exploration were followed by periods of consolidation, grounding new knowledge in fundamental concepts. We can only hope that the current period of creative tinkering in artificial intelligence, quantum devices and genetic editing, with its cornucopia of useful applications, will eventually lead to a deeper understanding of the world…
Today’s powerful but little-understood artificial intelligence breakthroughs echo past examples of unexpected scientific progress: “The Uselessness of Useful Knowledge,” from @RHDijkgraaf at @the_IAS.
Pair with: “Neuroscience’s Existential Crisis: we’re mapping the brain in amazing detail — but our brain can’t understand the picture” for a less optimistic view.
* John Ciardi
###
As we experiment, we might recall that it was on this date in 1992 that the Roman Catholic Church admitted that it had erred in condemning Galileo. For 359 years, the Church had excoriated Galileo’s contentions (e.g., that the Earth revolves around the Sun) as anti-scriptural heresy. In 1633, at age 69, Galileo had been forced by the Roman Inquisition to repent, and spent the last eight years of his life under house arrest. After 13 years of inquiry, Pope John Paul II’s commission of historical, scientific, and theological scholars brought the pontiff a “not guilty” finding for Galileo; the Pope himself met with the Pontifical Academy of Sciences to help correct the record.

“The ‘paradox’ is only a conflict between reality and your feeling of what reality ‘ought to be’”*…
One of the most bizarre aspects of quantum physics is that the fundamental entities that make up the Universe, what we know as the indivisible quanta of reality, behave as both a wave and a particle. We can do certain experiments, like firing photons at a sheet of metal, where they act like particles, interacting with the electrons and kicking them off only if they individually have enough energy. Other experiments, like firing photons at small thin objects — whether slits, hairs, holes, spheres, or even DVDs — give patterned results that show exclusively wave-like behavior. What we observe appears to depend on which observations we make, which is frustrating, to say the least. Is there some way to tell, fundamentally, what the nature of a quantum is, and whether it’s wave-like or particle-like at its core?
That’s what Sandra Marin wants to know, asking:
“I wonder if you could help me to understand John Wheeler – the delayed choice experiment and write an article about this.”
John Wheeler was one of the most brilliant minds in physics in the 20th century, responsible for enormous advances in quantum field theory, General Relativity, black holes, and even quantum computing. Yet the idea of the delayed choice experiment hearkens all the way back to perhaps our first experience with the wave-particle duality of quantum physics: the double-slit experiment…
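To make those wave-like “patterned results” concrete before turning to Wheeler’s twist, here is a minimal sketch of the textbook double-slit fringe calculation; the wavelength, slit separation, and screen distance are illustrative values, not figures from the article:

```python
# Textbook double-slit interference: bright fringes appear where the path
# difference between the two slits is a whole number of wavelengths,
# d * sin(theta) = m * lambda. For small angles, fringes on the screen are
# evenly spaced by approximately lambda * L / d.
WAVELENGTH = 650e-9   # red laser light, m (illustrative value)
SLIT_SEP = 0.1e-3     # distance between slit centers, m (illustrative value)
SCREEN_DIST = 1.0     # slit-to-screen distance, m (illustrative value)

fringe_spacing = WAVELENGTH * SCREEN_DIST / SLIT_SEP
print(f"fringe spacing: {fringe_spacing * 1e3:.1f} mm")  # ~6.5 mm

# Even photons sent through one at a time build up this striped pattern,
# which is the wave-like behavior the delayed-choice experiment interrogates.
```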
Although Einstein dearly wanted us to have a completely comprehensible reality, where everything that occurred obeyed our notions of cause-and-effect without any retrocausality, it was his great rival Bohr who turned out to be correct on this point. In Bohr’s own words:
“…it…can make no difference, as regards observable effects obtainable by a definite experimental arrangement, whether our plans for constructing or handling the instruments are fixed beforehand or whether we prefer to postpone the completion of our planning until a later moment when the particle is already on its way from one instrument to another.”
As far as we can tell, there is no one true objective, deterministic reality that exists independently of observers or interactions. In this Universe, you really do have to observe in order to find out what you get.
The history and the results of John Wheeler’s famous “delayed choice” experiments: “Is Light Fundamentally A Wave Or A Particle?”
* Richard Feynman
###
As we reconsider categories, we might recall that it was on this date in 1404 that King Henry IV signed the “Act Against Multipliers,” stipulating that “None from hereafter shall use to multiply gold or silver, or use the craft of multiplication; and if any the same do, they incur the pain of felony.” Great alarm was felt at that time lest any alchemist should succeed in “transmutation” (the conversion of a base metal into gold or silver), thus undermining the sanctity of the Royal currency and/or possibly financing rebellious uprisings. Alchemy, which had flourished since the time of Roger Bacon, effectively became illegal.
The Act was repealed in 1689, after Robert Boyle, the father of modern chemistry, and other members of the vanguard of the scientific revolution lobbied against it.