(Roughly) Daily

Posts Tagged ‘quantum computing’

“The ‘paradox’ is only a conflict between reality and your feeling of what reality ‘ought to be’”*…

John Stewart Bell (1928-1990), the Northern Irish physicist whose work sparked a quiet revolution in quantum physics

Elegant experiments with entangled light have laid bare a profound mystery at the heart of reality. Daniel Garisto explains the importance of the work done by this year’s Nobel laureates in Physics…

One of the more unsettling discoveries in the past half century is that the universe is not locally real. “Real,” meaning that objects have definite properties independent of observation—an apple can be red even when no one is looking; “local” means objects can only be influenced by their surroundings, and that any influence cannot travel faster than light. Investigations at the frontiers of quantum physics have found that these things cannot both be true. Instead, the evidence shows objects are not influenced solely by their surroundings and they may also lack definite properties prior to measurement. As Albert Einstein famously bemoaned to a friend, “Do you really believe the moon is not there when you are not looking at it?”

This is, of course, deeply contrary to our everyday experiences. To paraphrase Douglas Adams, the demise of local realism has made a lot of people very angry and been widely regarded as a bad move.

Blame for this achievement has now been laid squarely on the shoulders of three physicists: John Clauser, Alain Aspect and Anton Zeilinger. They equally split the 2022 Nobel Prize in Physics “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” (“Bell inequalities” refers to the pioneering work of the Northern Irish physicist John Stewart Bell, who laid the foundations for this year’s Physics Nobel in the early 1960s.) Colleagues agreed that the trio had it coming, deserving this reckoning for overthrowing reality as we know it. “It is fantastic news. It was long overdue,” says Sandu Popescu, a quantum physicist at the University of Bristol. “Without any doubt, the prize is well-deserved.”
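(For the quantitatively inclined: the CHSH form of Bell's inequality says that in any local hidden-variable theory the correlation combination S = E(a,b) − E(a,b′) + E(a′,b) + E(a′,b′) must satisfy |S| ≤ 2, while quantum mechanics predicts values up to 2√2 ≈ 2.83. The Python sketch below illustrates that gap– a made-up toy local model against the textbook quantum prediction for spin-singlet pairs; it is an illustration, not a reconstruction of the laureates' experiments.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Measurement settings (radians) that maximize the quantum CHSH value
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4

def chsh(E):
    # S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Textbook quantum prediction for a spin singlet: E(x, y) = -cos(x - y)
S_quantum = chsh(lambda x, y: -np.cos(x - y))

# A toy local-hidden-variable model: each pair shares a random angle lam,
# and each side's outcome depends only on its own setting and lam
lam = rng.uniform(0.0, 2 * np.pi, 1_000_000)

def E_local(x, y):
    A = np.sign(np.cos(x - lam))    # Alice's deterministic outcome (+/-1)
    B = -np.sign(np.cos(y - lam))   # Bob's anti-correlated outcome
    return float(np.mean(A * B))

print(f"quantum   |S| = {abs(S_quantum):.3f}  (Tsirelson bound 2*sqrt(2))")
print(f"local toy |S| = {abs(chsh(E_local)):.3f}  (Bell bound 2)")
```

Any model of the second kind– where outcomes are fixed in advance by shared local information– is provably stuck at 2; the experiments honored this year measured values beyond it.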

“The experiments beginning with the earliest one of Clauser and continuing along, show that this stuff isn’t just philosophical, it’s real—and like other real things, potentially useful,” says Charles Bennett, an eminent quantum researcher at IBM…

Quantum foundations’ journey from fringe to favor was a long one. From about 1940 until as late as 1990, the topic was often treated as philosophy at best and crackpottery at worst. Many scientific journals refused to publish papers in quantum foundations, and academic positions indulging such investigations were nearly impossible to come by…

Today, quantum information science is among the most vibrant and impactful subfields in all of physics. It links Einstein’s general theory of relativity with quantum mechanics via the still-mysterious behavior of black holes. It dictates the design and function of quantum sensors, which are increasingly being used to study everything from earthquakes to dark matter. And it clarifies the often-confusing nature of quantum entanglement, a phenomenon that is pivotal to modern materials science and that lies at the heart of quantum computing…

Eminently worth reading in full: “The Universe Is Not Locally Real, and the Physics Nobel Prize Winners Proved It,” from @dangaristo in @sciam.

Apposite: entangled particles and wormholes could be manifestations of the same phenomenon, and resolve paradoxes like information escaping a black hole: “Black Holes May Hide a Mind-Bending Secret About Our Universe.” 

* Richard Feynman

###

As we rethink reality, we might spare a thought for Walter Brattain; he died on this date in 1987. A physicist (at Bell Labs at the time), he worked with John Bardeen and William Shockley to invent the point-contact transistor in 1947, the birth of the semiconductor age– work for which the trio shared the Nobel Prize in Physics in 1956.

At college, Brattain said, he majored in physics and math because they were the only subjects he was good at. He became a solid physicist with a good understanding of theory, but his strength was in physically constructing experiments. Working from the ideas of Shockley and Bardeen, Brattain built the first transistor with his own hands. Before long, the transistor replaced the bulkier vacuum tube for many uses and became the forerunner of microminiature electronic parts.

As semiconductor technology has advanced, it has begun to incorporate quantum effects.

source

“If you are confused by the underlying principles of quantum technology – you get it!”*…

A tour through the map above– a helpful primer on the origins, development, and possible futures of quantum computing…

From Dominic Walliman (@DominicWalliman) on @DomainOfScience.

* Kevin Coleman

###

As we embrace uncertainty, we might spare a thought for Alan Turing; he died on this date in 1954. A British mathematician, he was a foundational computer science pioneer (inventor of the Turing Machine, creator of the “Turing Test” (perhaps to be made more relevant by quantum computing :), and inspiration for “The Turing Award”) and cryptographer (a leading member of the team that cracked the Enigma code during WWII).

source

“If you are not completely confused by quantum mechanics, you do not understand it”*…


If we can harness it, quantum technology promises fantastic new possibilities. But first, scientists need to coax quantum systems to hold together for longer than a few millionths of a second.

A team of scientists at the University of Chicago’s Pritzker School of Molecular Engineering announced the discovery of a simple modification that allows quantum systems to stay operational—or “coherent”—10,000 times longer than before. Though the scientists tested their technique on a particular class of quantum systems called solid-state qubits, they think it should be applicable to many other kinds of quantum systems and could thus revolutionize quantum communication, computing and sensing…

Down at the level of atoms, the world operates according to the rules of quantum mechanics—very different from what we see around us in our daily lives. These different rules could translate into technology like virtually unhackable networks or extremely powerful computers; the U.S. Department of Energy released a blueprint for the future quantum internet in an event at UChicago on July 23. But fundamental engineering challenges remain: Quantum states need an extremely quiet, stable space to operate, as they are easily disturbed by background noise coming from vibrations, temperature changes or stray electromagnetic fields.

Thus, scientists try to find ways to keep the system coherent as long as possible…
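(A toy Monte Carlo may help make “coherence” concrete. Below, each qubit in an ensemble picks up phase from a random frequency offset, and the ensemble coherence is the magnitude of the average phase factor; a refocusing scheme– spin echoes, or the continuous driving reported in the UChicago work– cancels the static part of the noise, leaving only a small residue. The model and every number in it are invented for illustration; this is not the group's actual protocol.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented numbers, for illustration only
n = 100_000
sigma = 2 * np.pi * 1e6                    # 1 MHz spread of static offsets
static = rng.normal(0.0, sigma, n)         # quasi-static noise per qubit
drift = rng.normal(0.0, 0.001 * sigma, n)  # slow residue refocusing can't cancel

def coherence(t, protected):
    # Ensemble coherence C(t) = |<exp(i * phi)>|; a protected qubit
    # accumulates phase only from the uncancelled residual drift
    phi = (drift if protected else static + drift) * t
    return abs(np.mean(np.exp(1j * phi)))

for t in (1e-7, 1e-6, 1e-4):
    print(f"t = {t:.0e} s: bare C = {coherence(t, False):.3f}, "
          f"protected C = {coherence(t, True):.3f}")
```

With the residual noise a thousand times smaller, the coherence time stretches by the same factor– the same arithmetic, at a grander scale, behind the 10,000× headline.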

“This breakthrough lays the groundwork for exciting new avenues of research in quantum science,” said study lead author David Awschalom, the Liew Family Professor in Molecular Engineering, senior scientist at Argonne National Laboratory and director of the Chicago Quantum Exchange. “The broad applicability of this discovery, coupled with a remarkably simple implementation, allows this robust coherence to impact many aspects of quantum engineering. It enables new research opportunities previously thought impractical.”…

Very big news at a very small scale: “Scientists discover way to make quantum states last 10,000 times longer.”

* John Wheeler

###

As we strive for stability, we might send calculated birthday greetings to Brook Taylor; he was born on this date in 1685.  A mathematician, he is best known for his work in describing and understanding oscillation.  In 1708, Taylor produced a solution to the problem of the center of oscillation.  His Methodus incrementorum directa et inversa (“Direct and Indirect Methods of Incrementation,” 1715) introduced what is now called the calculus of finite differences.  Using this, he was the first to express mathematically the movement of a vibrating string on the basis of mechanical principles.  Methodus also contained Taylor’s theorem, later recognized by Joseph Lagrange as the basis of differential calculus.
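(For reference, the theorem that bears his name, in its modern form:

```latex
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^{k} + R_n(x),
\qquad
R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}
```

for some ξ between a and x. Truncating the sum turns hard functions into polynomials– which is why the result underpins so much of applied mathematics. The remainder form shown is Lagrange's, a later refinement; Taylor stated the expansion without it.)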

A gifted artist, Taylor also wrote on the basic principles of perspective, including the first general treatment of the principle of vanishing points.

source


“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…


In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.
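(Chua's definition is compact enough to state here. A memristor is the element for which flux linkage φ and charge q– the time integrals of voltage and current– are directly related:

```latex
q(t) = \int_{-\infty}^{t} i(\tau)\,d\tau, \quad
\varphi(t) = \int_{-\infty}^{t} v(\tau)\,d\tau, \quad
M(q) = \frac{d\varphi}{dq}
\;\Rightarrow\;
v(t) = M\big(q(t)\big)\, i(t)
```

When M is constant, this collapses to Ohm's law; because M depends on q– the total charge that has ever flowed through the device– its resistance encodes its history, powered or not. That is the “memory” in memristor.)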

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead

###

As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers, and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman who was tight with the fees he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped them enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses

source


“A classical computation is like a solo voice—one line of pure tones succeeding each other. A quantum computation is like a symphony—many lines of tones interfering with one another.”*…


Quantum computers will never fully replace “classical” ones like the device you’re reading this article on. They won’t run web browsers, help with your taxes, or stream the latest video from Netflix.

What they will do—what’s long been hoped for, at least—will be to offer a fundamentally different way of performing certain calculations. They’ll be able to solve problems that would take a fast classical computer billions of years to perform. They’ll enable the simulation of complex quantum systems such as biological molecules, or offer a way to factor incredibly large numbers, thereby breaking long-standing forms of encryption.

The threshold where quantum computers cross from being interesting research projects to doing things that no classical computer can do is called “quantum supremacy.” Many people believe that Google’s quantum computing project will achieve it later this year…
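(One back-of-the-envelope way to see why a few dozen qubits marks the frontier: simulating n qubits by brute force means storing 2^n complex amplitudes. The sketch below– a rough illustration, assuming a full state-vector simulation at double precision, which is not the only classical strategy– tallies the memory required. Google's processor that ultimately claimed the milestone had 53 working qubits.)

```python
def human(nbytes: float) -> str:
    # Render a byte count in binary units
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if nbytes < 1024:
            return f"{nbytes:,.0f} {unit}"
        nbytes /= 1024
    return f"{nbytes:,.0f} EiB"

# An n-qubit state vector holds 2**n complex amplitudes,
# 16 bytes each at double precision
for n in (20, 30, 40, 53):
    print(f"{n:>2} qubits -> {human(2 ** n * 16)}")
```

Twenty qubits fit comfortably in a laptop's RAM; 53 would demand on the order of a hundred petabytes– which is the sense in which “supremacy” is a statement about classical machines as much as quantum ones.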

Researchers are getting close to building a quantum computer that can perform tasks a classical computer can’t. Here’s what the milestone will mean: “Quantum Supremacy Is Coming: Here’s What You Should Know.”

* Seth Lloyd, Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos

###

As we get weird, we might recall that it was on this date in 2012 that Ohioan Beth Johnson attempted to break a record set on this same date in 1999 by a group of English college students– for the largest working yoyo in the world.  The British yoyo was 10 feet in diameter; hers, 11 feet, 9 inches.  (It weighed 4,620 lbs.)  Her attempt on this date failed, as did another.  But finally, in September 2012, she successfully deployed it from a crane in Cincinnati… and earned her way into the Guinness Book of World Records.


Beth Johnson and her record-setting creation

source
