(Roughly) Daily


“Those who can imagine anything, can create the impossible”*…

As Charlie Wood explains, physicists are building neural networks out of vibrations, voltages and lasers, arguing that the future of computing lies in exploiting the universe’s complex physical behaviors…

… When it comes to conventional machine learning, computer scientists have discovered that bigger is better. Stuffing a neural network with more artificial neurons — nodes that store numerical values — improves its ability to tell a dachshund from a Dalmatian, or to succeed at myriad other pattern recognition tasks. Truly tremendous neural networks can pull off unnervingly human undertakings like composing essays and creating illustrations. With more computational muscle, even grander feats may become possible. This potential has motivated a multitude of efforts to develop more powerful and efficient methods of computation.

[Cornell’s Peter McMahon] and a band of like-minded physicists champion an unorthodox approach: Get the universe to crunch the numbers for us. “Many physical systems can naturally do some computation way more efficiently or faster than a computer can,” McMahon said. He cites wind tunnels: When engineers design a plane, they might digitize the blueprints and spend hours on a supercomputer simulating how air flows around the wings. Or they can stick the vehicle in a wind tunnel and see if it flies. From a computational perspective, the wind tunnel instantly “calculates” how wings interact with air.

A wind tunnel is a single-minded machine; it simulates aerodynamics. Researchers like McMahon are after an apparatus that can learn to do anything — a system that can adapt its behavior through trial and error to acquire any new ability, such as classifying handwritten digits or distinguishing one spoken vowel from another. Recent work has shown that physical systems like waves of light, networks of superconductors and branching streams of electrons can all learn.

“We are reinventing not just the hardware,” said Benjamin Scellier, a mathematician at the Swiss Federal Institute of Technology Zurich who helped design a new physical learning algorithm, but “also the whole computing paradigm.”…

Computing at the largest scale? “How to Make the Universe Think for Us,” from @walkingthedot in @QuantaMagazine.

* Alan Turing

###

As we think big, we might send well-connected birthday greetings to Leonard Kleinrock; he was born on this date in 1934. A computer scientist, he made several foundational contributions to the field, in particular to the theoretical foundations of data communication in computer networking. Perhaps most notably, he was central to the development of ARPANET (which essentially grew up to be the internet); his graduate students at UCLA were instrumental in developing the communication protocols for internetworking that made that possible.

Kleinrock at a meeting of the members of the Internet Hall of Fame


“If you are confused by the underlying principles of quantum technology – you get it!”*…

A tour through the map above– a helpful primer on the origins, development, and possible futures of quantum computing…

From Dominic Walliman (@DominicWalliman) on @DomainOfScience.

* Kevin Coleman

###

As we embrace uncertainty, we might spare a thought for Alan Turing; he died on this date in 1954. A British mathematician, he was a foundational computer science pioneer: inventor of the Turing Machine, creator of the “Turing Test” (perhaps to be made more relevant by quantum computing), and inspiration for the Turing Award. He was also a cryptographer, a leading member of the team that cracked the Enigma code during WWII.


“Prediction and explanation are exactly symmetrical”*…

From a December 1969 episode of the BBC series Tomorrow’s World, an eerily prescient look at the computerized future of banking…

The emergence of the debit card, the impact on back-office jobs, the receding importance of branch banks… they nailed it.

TotH to Benedict Evans (@benedictevans)

* “Prediction and explanation are exactly symmetrical. Explanations are, in effect, predictions about what has happened; predictions are explanations about what’s going to happen.” – John Searle

###

As we find our ways into the future, we might recall that it was on this date in 1878 that the modern music business was effectively born: Thomas Edison was awarded U.S. Patent No. 200,521 for his invention, the phonograph.

Thomas Edison with his phonograph, photographed by Mathew Brady in Washington, April 1878



“Visualization gives you answers to questions you didn’t know you had”*…

Reckoning before writing: Mesopotamian Clay Tokens

Physical representations of data have existed for thousands of years. The List of Physical Visualizations (and the accompanying Gallery) collects illustrative examples, e.g…

5500 BC – Mesopotamian Clay Tokens

The earliest data visualizations were likely physical: built by arranging stones or pebbles, and later, clay tokens. According to an eminent archaeologist (Schmandt-Besserat, 1999):

“Whereas words consist of immaterial sounds, the tokens were concrete, solid, tangible artifacts, which could be handled, arranged and rearranged at will. For instance, the tokens could be ordered in special columns according to types of merchandise, entries and expenditures; donors or recipients. The token system thus encouraged manipulating data by abstracting all possible variables. (Harth 1983: 19) […] No doubt patterning, the presentation of data in a particular configuration, was developed to highlight special items (Luria 1976: 20).”

Clay tokens suggest that physical objects were used to externalize information, support visual thinking and enhance cognition way before paper and writing were invented…

There are 370 entries (so far). Browse them at List of Physical Visualizations (@dataphys)

* Ben Shneiderman

###

As we celebrate the concrete, we might send carefully calculated birthday greetings to Rolf Landauer; he was born on this date in 1927. A physicist, he made a number of important contributions in a range of areas: the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.

He is probably best remembered for “Landauer’s Principle,” which sets a lower bound on the energy a computer must dissipate as it operates. Whenever the machine resets for another computation, bits are flushed from its memory, and in that operation a certain minimum amount of energy is lost as heat (a simple logical consequence of the second law of thermodynamics). Thus, when information is erased, there is an inevitable “thermodynamic cost of forgetting,” one that ultimately governs how energy-efficient computers can become. The maximum entropy of a bounded physical system is finite– so while most engineers wrestled with the practical limitations of compacting ever more circuitry onto tiny chips, Landauer considered the theoretical limit: if technology improved indefinitely, how soon would it run into the insuperable barriers set by nature?

A so-called logically reversible computation, in which no information is erased, may in principle be carried out without releasing any heat. This has led to considerable interest in the study of reversible computing. Indeed, without reversible computing, increases in the number of computations per joule of energy dissipated must eventually come to a halt. If Koomey‘s law continues to hold, the limit implied by Landauer’s principle would be reached around the year 2050.
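
For concreteness, the numbers behind that limit are easy to work out: erasing one bit at temperature T dissipates at least kT·ln 2 of energy. A minimal sketch (the 300 K operating temperature is an illustrative assumption):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed operating temperature, in kelvin

# Landauer's principle: minimum energy dissipated per erased bit
e_min = K_B * T * math.log(2)    # ~2.87e-21 joules per bit
ops_per_joule = 1.0 / e_min      # ~3.5e20 irreversible bit erasures per joule

print(f"Landauer limit at {T:.0f} K: {e_min:.3e} J per erased bit")
print(f"Ceiling on irreversible bit operations: {ops_per_joule:.3e} per joule")
```

That ceiling, a few hundred quintillion erasures per joule, is the wall a continuation of Koomey’s law would hit around mid-century.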


“I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.”*…

Claude Shannon with his creation, Theseus the maze-solving mouse, an early illustration of machine learning and a follow-on project to the work described below

Readers will know of your correspondent’s fascination with the remarkable Claude Shannon (see here and here), remembered as “the father of information theory,” but seminally involved in so much more. In a recent piece in IEEE Spectrum, the redoubtable Rodney Brooks argues that we should add another credit to Shannon’s list…

Among the great engineers of the 20th century, who contributed the most to our 21st-century technologies? I say: Claude Shannon.

Shannon is best known for establishing the field of information theory. In a 1948 paper, one of the greatest in the history of engineering, he came up with a way of measuring the information content of a signal and calculating the maximum rate at which information could be reliably transmitted over any sort of communication channel. The article, titled “A Mathematical Theory of Communication,” describes the basis for all modern communications, including the wireless Internet on your smartphone and even an analog voice signal on a twisted-pair telephone landline. In 1966, the IEEE gave him its highest award, the Medal of Honor, for that work.
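
Both of the quantities that paper introduced reduce to compact formulas: the entropy of a source, H = −Σ p·log₂ p, and the capacity of a noisy channel, C = B·log₂(1 + S/N). A minimal illustration (the source distribution and channel figures below are made-up examples, not Shannon’s):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up four-symbol source: the more skewed the distribution,
# the less information each symbol carries
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol (uniform)
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits/symbol

# Shannon-Hartley capacity of a bandlimited Gaussian channel: C = B * log2(1 + S/N)
# (illustrative telephone-line numbers: 3 kHz of bandwidth, 30 dB signal-to-noise)
B, snr = 3000.0, 1000.0
print(f"Capacity: {B * math.log2(1 + snr):,.0f} bits per second")  # ~29,902
```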

If information theory had been Shannon’s only accomplishment, it would have been enough to secure his place in the pantheon. But he did a lot more…

In 1950 Shannon published an article in Scientific American and also a research paper describing how to program a computer to play chess. He went into detail on how to design a program for an actual computer…

Shannon did all this at a time when there were fewer than 10 computers in the world. And they were all being used for numerical calculations. He began his research paper by speculating on all sorts of things that computers might be programmed to do beyond numerical calculations, including designing relay and switching circuits, designing electronic filters for communications, translating between human languages, and making logical deductions. Computers do all these things today…
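
The chess scheme was concrete: score positions with a material-counting evaluation function (pawn 1, knight and bishop 3, rook 5, queen 9) and choose moves by minimax over a few plies. A toy sketch of that search, with hypothetical callbacks standing in for a real chess implementation:

```python
def minimax(state, depth, maximizing, evaluate, moves, play):
    """Depth-limited minimax: the move-selection scheme Shannon described in 1950."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    children = (minimax(play(state, m), depth - 1, not maximizing,
                        evaluate, moves, play) for m in legal)
    return max(children) if maximizing else min(children)

# Toy demo on a hand-built game tree, just to exercise the search; in chess,
# `evaluate` would be Shannon's material count and `moves` the legal-move generator.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
values = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}

best = minimax("root", depth=2, maximizing=True,
               evaluate=lambda s: values.get(s, 0),
               moves=lambda s: tree.get(s, []),
               play=lambda s, m: m)
print(best)  # 3 -- the maximizer picks branch "a", where the minimizer can only force 3
```

The same skeleton, an evaluation function plus depth-limited search, underlies every classical chess engine since.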

The “father of information theory” also paved the way for AI: “How Claude Shannon Helped Kick-start Machine Learning,” from @rodneyabrooks in @IEEESpectrum.

* Claude Shannon (who may or may not have been kidding…)

###

As we ponder possibility, we might send uncertain birthday greetings to Werner Karl Heisenberg; he was born on this date in 1901.  A theoretical physicist, he made important contributions to the theories of the hydrodynamics of turbulent flows, the atomic nucleus, ferromagnetism, superconductivity, cosmic rays, and subatomic particles.  But he is most widely remembered as a pioneer of quantum mechanics and author of what’s become known as the Heisenberg Uncertainty Principle.  Heisenberg was awarded the Nobel Prize in Physics for 1932 “for the creation of quantum mechanics.”

During World War II, Heisenberg was part of the team attempting to create an atomic bomb for Germany– for which he was arrested and detained by the Allies at the end of the conflict.  He was returned to Germany, where he became director of the Kaiser Wilhelm Institute for Physics, which soon thereafter was renamed the Max Planck Institute for Physics. He later served as president of the German Research Council, chairman of the Commission for Atomic Physics, chairman of the Nuclear Physics Working Group, and president of the Alexander von Humboldt Foundation.

“Some things are so serious that one can only joke about them.” – Werner Heisenberg

