Posts Tagged ‘Science’
“Darling, it’s better / down where it’s wetter”*…
From our old friend Neal Agarwal, a long (and illuminating) scroll from the surface to the “bottom” of the ocean (10,924 meters down)…
“The Deep Sea,” from @neal.fun.
* Sebastian, “Under the Sea,” The Little Mermaid
###
As we dive: we might send connected birthday greetings to a man who put the sea floor to use, Clarence Mackay; he was born on this date in 1874. An early telecom entrepreneur, he supervised the completion of the first transpacific cable between the United States and the Far East in 1904. He laid a cable between New York and Cuba in 1907, and later established cable communication with southern Europe via the Azores and with northern Europe via Ireland. And in 1928, he became the first to combine radio, cables, and telegraphs under one management. (He sold his business, Postal Telegraph and Cable Corporation, to International Telephone and Telegraph Company [ITT] for an enormous amount of stock… just in time for the stock market crash in 1929, which wiped him out. He survived the Great Depression by selling his art and antiques.)
“Quantum computation is … nothing less than a distinctly new way of harnessing nature”*…
As the tools in the world around us change, the world– and we– change with them. The onslaught of AI is the change that seems to be grabbing most of our mindshare these days… and with reason. But there are, of course, other changes (in biotech, in materials science, et al.) that are also going to be hugely impactful.
Today, a look at the computing technology stalking up behind AI: quantum computing. As enthusiasts like David Deutsch (author of the quote above) argue, it can have tremendous benefits, perhaps especially in our ability to model (and thus better understand) our reality.
But quantum computing will, if/when it arrives, also present huge challenges to us as individuals and as societies– perhaps most prominently in its threat to the ways in which we protect our systems and our information: We’ve felt pretty safe for decades, secure in the knowledge that, while we could lose passwords to phishing or hacks, it would take the “classical” computers we have a billion years to break today’s RSA-2048 encryption. A quantum computer could crack it in as little as a hundred seconds.
The technology has been “somewhere on the horizon” for 30 years… so not something that has seemed urgent to confront. But progress has accelerated; a recent Google paper reports on a programming and architectural breakthrough that greatly reduces the computing resources necessary to break classical cryptography… bringing “Q-Day” (the point at which quantum computers become powerful enough to break standard encryption methods like RSA and ECC, endangering global digital security) much closer, and putting everything from crypto-wallets to our e-banking accounts at risk.
Charlie Wood brings us up to speed…
Some 30 years ago, the mathematician Peter Shor took a niche physics project — the dream of building a computer based on the counterintuitive rules of quantum mechanics — and shook the world.
Shor worked out a way for quantum computers to swiftly solve a couple of math problems that classical computers could complete only after many billions of years. Those two math problems happened to be the ones that secured the then-emerging digital world. The trustworthiness of nearly every website, inbox, and bank account rests on the assumption that these two problems are impossible to solve. Shor’s algorithm proved that assumption wrong.
For 30 years, Shor’s algorithm has been a security threat in theory only. Physicists initially estimated that they would need a colossal quantum machine with billions of qubits — the elements used in quantum calculations — to run it. That estimate has come down drastically over the years, falling recently to a million qubits. But it has still always sat comfortably beyond the modest capabilities of existing quantum computers, which typically have just hundreds of qubits.
However, two different groups of researchers have just announced advances that notably reduce the gap between theoretical estimates and real machines. A star-studded team of quantum physicists at the California Institute of Technology went public with a design for a quantum computer that could break encryption with only tens of thousands of qubits and said that it had formed a company to build the machine. And researchers at Google announced that they had developed an implementation of Shor’s algorithm that is ten times as efficient as the best previous method.
Neither company has the hardware to break encryption today. But the results underscore what some quantum physicists had already come to suspect: that powerful quantum computers may be years away, rather than decades. “If you care about privacy or you have secrets, then you better start looking for alternatives,” said Nikolas Breuckmann, a mathematical physicist at the University of Bristol, who did not work on either of the papers.
While the new results may provide a jolt for the policymakers and corporations that guard our digital infrastructure, they also signal the rapid progress that physicists have made toward building machines that will let them more thoroughly explore the quantum world.
“We’re going to actually do this,” said Dolev Bluvstein, a Caltech physicist and CEO of the new company, Oratomic…
[Wood unpacks the history of the development of the technology and explores the challenges that remain; he concludes…]
… If any group succeeds at building a quantum computer that can realize Shor’s algorithm, it will mark the end of an era — specifically, the “Noisy Intermediate Scale Quantum” era, as Preskill dubbed the pre-error-correction period in a 2018 paper. Each researcher has a vision for what to pursue first with a machine in the new “fault-tolerant” era.
[Robert] Huang said he would start by running Shor’s algorithm, just to prove that the device works. After that, he said he would try to use it to speed up machine learning — an application to be detailed in coming work.
Most of the architects building quantum computers, whether at Oratomic or other startups, are physicists at heart. They’re interested in physics, not cryptography. Specifically, they’re interested in all the things a computer fluent in the language of quantum mechanics could teach them about the quantum realm, such as what sort of materials might become superconductors even at warm temperatures. Preskill, for his part, would like to simulate the quantum nature of space-time.
The Caltech group knows it has years of work ahead before any of its dreams have a chance of coming true. But the researchers can’t wait to get started. “Pick a cooler life quest than building the world’s first quantum computer with your friends!” said a jubilant Bluvstein, reached by phone shortly before their paper went live, before rushing off to celebrate…
Eminently worth reading in full: “New Advances Bring the Era of Quantum Computers Closer Than Ever,” from @walkingthedot.bsky.social in @quantamagazine.bsky.social.
* David Deutsch, The Fabric of Reality
###
As we prepare, we might take a moment to appreciate just how vastly and deeply the legacy systems challenged by quantum computing run, recalling that on this date in 1959 Mary Hawes, a computer scientist at the Burroughs Corporation, held a meeting of computer users, manufacturers, and academics at the University of Pennsylvania aimed at creating a common business-oriented programming language. At the meeting, Grace Hopper suggested that they ask the Department of Defense to fund the effort to create such a language. Also attending was Charles Phillips, director of the Data System Research Staff at the DoD, who was excited by the possibility of a common language streamlining the department’s operations; he agreed to sponsor its creation. This was the genesis of what would eventually become the COBOL language.
To this day COBOL is still the most common programming language used in business, finance, and administrative systems for companies and governments, primarily on mainframe systems, with around 200 billion lines of code still in production use… all of which are in question and/or at risk in a world of quantum computing.
“In the last analysis, a pickle is a cucumber with experience”*…
In an excerpt from their book, The Pickled City: The Story of New York Pickles, Paul van Ravestein and Monique Mulder explore the evolution of fermentation across the ages…
Pickling vegetables began in Mesopotamia around 2400 BCE, where brining cucumbers addressed the challenge of preserving food in a hot climate. Brine, a mixture of water and salt, proved effective at inhibiting spoilage while enhancing the flavor of food. This innovation quickly spread to neighboring civilizations, embedding itself in the culinary practices of ancient Egypt, Greece, and Rome.
Cleopatra, one of Egypt’s most iconic figures, believed that pickles contributed to her legendary beauty. This association between pickles and vitality reflected a broader cultural fascination with preserved foods. Julius Caesar ensured that his soldiers carried pickles on their campaigns, claiming that the preserved vegetables fortified their strength and stamina. This notion of pickles as both nourishment and tonic was echoed by Greek philosophers like Aristotle, who praised their medicinal properties.
The culinary sophistication of ancient Rome brought pickling into sharper focus. The Roman cookbook attributed to the Roman merchant and epicure Apicius, De Re Coquinaria, included numerous references to preserved vegetables, particularly olives and cucumbers. Apicius wrote of the importance of balance in brining, using spices like dill, mustard seed, and coriander seed to create complex flavors that complemented meals. The ability to elevate simple ingredients through preservation became a hallmark of Roman gastronomy, showcasing pickling as both art and science.
The spread of pickling innovations along trade routes like the Silk Road and the Spice Route highlights its significance in cultural exchange. Roman traders, for example, likely encountered Asian pickling techniques through the Silk Road’s bustling networks of goods and ideas. Spices such as cinnamon, peppercorns, and cumin—integral to pickling recipes—traveled vast distances, linking the culinary practices of the Mediterranean, India, and China.
In Asia, pickling developed independently but with striking parallels. Chinese records from the Zhou Dynasty (1046-256 BCE) mention fermented vegetables, including pickled radishes and cabbages, which were essential for sustenance during harsh winters. Similarly, Indian achar evolved as a culinary treasure, incorporating local spices like turmeric, fenugreek, and mustard to enhance preservation and flavor. Japanese pickling methods, such as nukazuke (fermentation in rice bran), emphasized minimalism and balance, reflecting the cultural values of harmony and simplicity.
The maritime trade routes of Southeast Asia and the Indian Ocean allowed pickling traditions to travel across vast regions, influencing cuisines from the Malay Archipelago to the Arabian Peninsula. The Indian Ocean trade ensured that spices like cloves and nutmeg became integral to pickling recipes worldwide, enriching their flavor profiles and preserving their cultural legacies.
Pickling’s role extended beyond culinary practices, becoming intertwined with religious and cultural rituals. In Jewish tradition, the Talmud makes numerous references to pickled vegetables, particularly turnips, which symbolize abundance and endurance. Pickled foods often accompanied bread during blessings, emphasizing their role as both sustenance and spiritual connection.
Their transformation through pickling—turning a simple, earthy root into a tangy, vibrant dish—was often seen as a metaphor for renewal and the endurance of the Jewish people through adversity. During the springtime Jewish holiday of Purim, which commemorates the triumph of the Jewish people over oppression in ancient Persia, the giving of food gifts (mishloach manot) occasionally included pickled vegetables, reflecting the value of sharing preserved foods that sustained communities through lean times. And colorful Yiddish sayings like er drayt sich arum vie a forts in roosl (he wanders around like a fart in a pickle barrel) highlight the humorous associations with pickling, eating, and bodily functions.
Hindu culture imbued pickles with sacred meaning. The balance of flavors in achar—salty, sour, sweet, and spicy—was seen as a reflection of life’s harmony. Pickles were often prepared as offerings during religious festivals, symbolizing prosperity and the nurturing of the human spirit.
Christian monastic traditions adopted pickling during the Middle Ages as a way to sustain communities through long fasting periods. Pickled fish and vegetables became essential components of monastic diets, reflecting the intersection of faith, practicality, and culinary ingenuity.
In Islamic cultures, pickles played a central role in Ramadan feasts, their tangy flavors providing refreshment after a day of fasting. Preserved lemons, a staple in Moroccan cuisine, became symbolic of hospitality and were often served to honored guests. Ancient Chinese rituals also celebrated the cultural significance of pickling, with fermented vegetables used in ancestor worship as symbols of continuity and filial piety.
Trade routes such as the Silk Road and those across the Sahara were pivotal in spreading pickling techniques and ingredients across diverse cultures. These routes facilitated the exchange of goods like salt and vinegar, essential to pickling, along with the culinary knowledge that transformed them into staples of global cuisine…
Read on for medieval and early modern innovations, pickling evolution in the Eighteenth and Nineteenth Centuries, industrialization and the modern culinary renaissance, and pickles in pop culture: “A Brief and Essential History of the Most Important Food Ever Invented: The Pickle,” from @lithub.com.web.brid.gy.
###
As we break out the brine, we might spare a thought for a man who put fermentation to a different kind of use, André Tchelistcheff; he died on this date in 1994. An oenologist, he was a pivotal figure in the revitalization of the California wine industry following Prohibition (1919-33) and used his (French) training in viticulture and wine-making to define the style of California’s best wines, especially Cabernet Sauvignon, and to pioneer such techniques as the cold fermentation (now widely used in producing white and rose wines) and the use of American oak barrels for aging. He also developed frost-prevention techniques and helped curb vine disease in Napa Valley. In addition to managing Beaulieu Vineyards in Napa for 35 years, Tchelistcheff operated a private wine laboratory in St. Helena for 15 years. He also assembled a fabled library of wine literature.
“The earth is bountiful, and where her bounty fails, nitrogen drawn from the air will refertilize her womb.”*…
As the Iran War continues to unfold, there is understandably a great deal of concern about energy prices (and the prices of things that depend on energy). We might forget that the Middle East is also crucial to the world’s fertilizer supply– though not for long, as farmers (along with everyone else in the food chain, all the way down to all of us eaters) are beginning to feel the pain.
But, as Diana Kruzman reports, even as fertilizer trade concerns are growing, a revolutionary sourcing alternative has emerged– one that could make a huge positive difference if it proves out at scale…
The world has an almost insatiable demand for nitrogen. Crops need it to grow, but although it makes up 78 percent of our atmosphere, plants can’t just pull it in from the air the way they do with oxygen. Instead, they rely on bacteria in the soil to convert it into nitrate, a form they can use; in the case of agriculture, think of fertilizer spread by humans. Leaving aside organic options like cow manure, most farmers use ammonia produced mainly from natural gas using a technique called the Haber-Bosch process, which was invented in 1909. [See also here.]
Haber-Bosch is expensive and energy-intensive, responsible for up to two percent of the world’s annual greenhouse gas emissions. It’s also spurred a global nitrogen pollution crisis; as much as two-thirds of nitrogen fertilizer applied to crops is never used, and the excess escapes into the soil, air, and water, raising the cancer risk in nearby communities and contributing to climate change.
Researchers have been trying to find an alternative way to get nitrogen to plants for decades — turning to everything from microbes to human urine. But so far, these scientific advancements haven’t translated into much practical change for farmers, who for the most part still rely on ammonia (which, granted, is getting greener, but is increasingly vulnerable to global price shocks).
That could soon change with the growth in popularity of a new technology known as plasma activated water, or PAW. Around the U.S., scientists and startups are experimenting with this high-tech solution, which uses electricity to pull nitrogen from the air, mix it with water, and create fertilizer straight on the farm. The concept, on the surface, seems suspiciously rosy — on-demand nitrogen, in a form plants can use, at just the cost of electricity (and the initial price of the machine used to make it). But early adopters have told Offrange that it genuinely works…
… PAW uses electricity to transform air into plasma — the fourth state of matter (besides gases, solids, and liquids), which typically forms at high temperatures. When the plasma comes into contact with water, it encourages chemical reactions that form nitrates — the type of nitrogen that plants need. Though this process was actually invented in 1903, even before Haber-Bosch, it required so much energy that it never achieved widespread use.
But in recent years, those energy needs have gone down thanks to the development of “cold plasma” technology, which operates at less than 60 degrees Fahrenheit. It’s also used for medical sterilization and food safety, and over the last decade researchers have worked to develop new ways to apply it for agricultural production…
More at: “Pulling Nitrogen From the Air” from @dkruzman.bsky.social.
* Nikola Tesla (who, around 1900, imagined and experimented with something like the Birkeland–Eyde-based plasma process described above)
###
As we count on creativity, we might send healthy birthday greetings to a man who explained one of the central ways in which we depend on the food that we eat, William Cumming Rose; he was born on this date in 1887. A biochemist, he researched amino acids, discovered threonine, and established the importance of the nine essential amino acids in human nutrition (that’s to say, the amino acids that our bodies cannot synthesize and that we must consume in our food). He received the National Medal of Science in 1966.
“For what man in the natural state or course of thinking did ever conceive it in his power to reduce the notions of all mankind exactly to the same length, and breadth, and height of his own? Yet this is the first humble and civil design of all innovators in the empire of reason.”*…
A “theory of everything” (a Grand Unified Theory on steroids)– a (still hypothetical) coherent theoretical framework of physics containing and explaining all physical principles– is the holy grail of physicists. Natalie Wolchover checks in on the most recent front-runner in the hunt…
Fifty-eight years after it first appeared, string theory remains the most popular candidate for the “theory of everything,” the unified mathematical framework for all matter and forces in the universe. This is much to the chagrin of its rather vocal critics. “String theory is not dead; it’s undead and now walks around like a zombie eating people’s brains,” the former physicist Sabine Hossenfelder said on her popular YouTube channel in 2024.
String theory is a “failure,” the mathematical physicist and blogger Peter Woit often says. His complaint is not that string theory is wrong — it’s that it’s “not even wrong,” as he titled a 2006 book. The theory says that, on scales of billionths of trillionths of trillionths of a centimeter, extra curled-up spatial dimensions reveal themselves and particles resolve into extended objects — strands and loops of energy — rather than points. But this alleged substructure is too small to detect, probably ever. The prediction is untestable.
A further problem is that uncountably many different configurations of dimensions and strings are permitted at those tiny scales; the theory can give rise to a limitless variety of universes. Amid this vast landscape of solutions, no one can hope to find a precise microscopic configuration that undergirds our particular macroscopic world.
These issues are profound indeed. Yet in my experience, the typical high-energy theorist in a prestigious university physics department still thinks string theory has a good chance of being correct, at least in part. The field has become siloed between those who deem it worth studying and those who don’t.
Recently, a new angle of attack has opened up. An approach called bootstrapping has allowed physicists to calculate that, under various starting assumptions about the universe, a key equation from string theory naturally follows. For some experts, these findings support the notion of “string uniqueness,” the idea that it is the only mathematically consistent quantum description of gravity and everything else.
Responding to one bootstrap paper on her YouTube channel, mere weeks after the “undead” comment, Hossenfelder said it was “string theorists do[ing] something sensible for once.” She added, “I’d say this paper strengthens the argument for string theory.”
Not everyone agrees, but the findings are reviving an important question. “This question of ‘Does string theory describe the world?’ has just been so taboo,” said Cliff Cheung, a physicist at the California Institute of Technology and an author of the paper discussed by Hossenfelder. Now, “people are actually thinking about it for the first time in decades.”
Getting wind of this work, I wanted to drill down on the logic and examine how the string hypothesis is faring these days…
And so she does: “Are Strings Still Our Best Hope for a Theory of Everything?” from @nattyover.bsky.social in @quantamagazine.bsky.social. Eminently worth reading in full.
Compare/contrast with: “Where Some See Strings, She Sees a Space-Time Made of Fractals.”
* Jonathan Swift, A Tale of a Tub
###
As we grapple with Gödel, we might spare a thought for Hermann Rorschach; he died on this date in 1922. A psychiatrist and psychoanalyst, he drew on his training in art to develop a set of inkblots that were used experimentally to measure various unconscious parts of a subject’s personality. Rorschach knew the human tendency to project interpretations and feelings onto ambiguous stimuli and believed that his subjects’ subjective responses enabled him to distinguish among them on the basis of their perceptive abilities, intelligence, and emotional characteristics. His method has come to be known as the Rorschach test, iterations of which have continued to be used over the years to help identify personality, psychotic, and neurological disorders.
Perhaps his insight that we humans tend “to project interpretations and feelings onto ambiguous stimuli” can inform our understanding of physicists trying to construct mental/conceptual models of our reality, which they’ve been doing for a very long time, and of the limitations of that quest.