(Roughly) Daily

Posts Tagged ‘information theory’

“Two obsessions are the hallmarks of Nature’s artistic style: Symmetry- a love of harmony, balance, and proportion [and] Economy- satisfaction in producing an abundance of effects from very limited means”*…

Life is built of symmetrical structures. But why? Sachin Rawat explores…

Life comes in a variety of shapes and sizes, but all organisms generally have at least one feature in common: symmetry.

Notice how your left half mirrors the right, or how a flower’s petals and a starfish’s arms are arranged radially. Such symmetry persists at the microscopic level, too, in the near-spherical shape of many microbes and in the identical sub-units of different proteins.

The abundance of symmetry in biological forms raises the question of whether symmetric designs provide an advantage. Any engineer would tell you that they do. Symmetry is crucial to designing modular, robust parts that can be combined to create more complex structures. Think of Lego blocks and how easily they can be assembled into just about anything.

However, unlike an engineer, evolution doesn’t have the gift of foresight. Some biologists suggest that symmetry must provide an immediate selective advantage. But any adaptive advantage that symmetry may provide isn’t by itself sufficient to explain its pervasiveness in biology across scales both great and small.

Now, based on insights from algorithmic information theory, a study published in Proceedings of the National Academy of Sciences suggests that there could be a non-adaptive explanation…
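To get a feel for the argument, note that algorithmic information theory measures an object’s complexity by the length of the shortest description that produces it. A structure assembled from repeated identical sub-units (one common form of biological symmetry) admits a far shorter description than one that must be specified position by position. A minimal sketch, using zlib compression as a crude, computable stand-in for description length (true Kolmogorov complexity is uncomputable, and the strings here are purely illustrative):

```python
import random
import string
import zlib

random.seed(0)

# "Symmetric" structure: one sub-unit repeated eight times,
# like identical protein sub-units arranged around an axis.
subunit = ''.join(random.choices(string.ascii_lowercase, k=25))
symmetric = (subunit * 8).encode()

# "Asymmetric" structure of the same length: every position independent.
asymmetric = ''.join(random.choices(string.ascii_lowercase, k=200)).encode()

# Compressed size approximates description length: the repeated
# (symmetric) structure needs far fewer bits to specify.
print(len(zlib.compress(symmetric)))    # small
print(len(zlib.compress(asymmetric)))   # much larger
```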

Symmetrical objects are less complex than non-symmetrical ones. Perhaps evolution acts as an algorithm with a bias toward simplicity: “Simple is beautiful: Why evolution repeatedly selects symmetrical structures,” from @sachinxr in @bigthink.

* Frank Wilczek (@FrankWilczek)

###

As we celebrate symmetry, we might recall (speaking of symmetry) that it was on this date in 1963 that the Equal Pay Act of 1963 was signed into law by President John F. Kennedy. Aimed at abolishing wage disparity based on sex, it provided that “[n]o employer having employees subject to any provisions of this section [section 206 of title 29 of the United States Code] shall discriminate, within any establishment in which such employees are employed, between employees on the basis of sex by paying wages to employees in such establishment at a rate less than the rate at which he pays wages to employees of the opposite sex in such establishment for equal work on jobs[,] the performance of which requires equal skill, effort, and responsibility, and which are performed under similar working conditions, except where such payment is made pursuant to (i) a seniority system; (ii) a merit system; (iii) a system which measures earnings by quantity or quality of production; or (iv) a differential based on any other factor other than sex […].”

Those exceptions (and lax enforcement) have meant that, 60 years later, women in the U.S. are still paid less than men in comparable positions in nearly all occupations, earning on average 83 cents for every dollar earned by a man in a similar role.

source

“One of the most singular characteristics of the art of deciphering is the strong conviction possessed by every person, even moderately acquainted with it, that he is able to construct a cipher which nobody else can decipher.”*…

And yet, for centuries no one has succeeded. Now, as Erica Klarreich reports, cryptographers want to know which of five possible worlds we inhabit, which will reveal whether truly secure cryptography is even possible…

Many computer scientists focus on overcoming hard computational problems. But there’s one area of computer science in which hardness is an asset: cryptography, where you want hard obstacles between your adversaries and your secrets.

Unfortunately, we don’t know whether secure cryptography truly exists. Over millennia, people have created ciphers that seemed unbreakable right until they were broken. Today, our internet transactions and state secrets are guarded by encryption methods that seem secure but could conceivably fail at any moment.

To create a truly secure (and permanent) encryption method, we need a computational problem that’s hard enough to create a provably insurmountable barrier for adversaries. We know of many computational problems that seem hard, but maybe we just haven’t been clever enough to solve them. Or maybe some of them are hard, but their hardness isn’t of a kind that lends itself to secure encryption. Fundamentally, cryptographers wonder: Is there enough hardness in the universe to make cryptography possible?
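The kind of lopsidedness cryptographers hope for can be felt in a toy example: multiplying two primes together is instantaneous, while recovering them from their product takes real work. A minimal sketch (tiny, insecure numbers and naive trial division, purely for illustration; whether genuinely one-way operations exist is exactly the open question):

```python
def multiply(p, q):
    # the "easy" direction: a single multiplication
    return p * q

def factor(n):
    # the "hard" direction, done naively by trial division; real attacks are
    # cleverer, but factoring large numbers is still believed (not proven) hard
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

n = multiply(104729, 1299709)   # instant
print(factor(n))                # ~100,000 divisions later: (104729, 1299709)
```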

In 1995, Russell Impagliazzo of the University of California, San Diego broke down the question of hardness into a set of sub-questions that computer scientists could tackle one piece at a time. To summarize the state of knowledge in this area, he described five possible worlds — fancifully named Algorithmica, Heuristica, Pessiland, Minicrypt and Cryptomania — with ascending levels of hardness and cryptographic possibility. Any of these could be the world we live in…

Explore each of them– and their implications for secure encryption– at “Which Computational Universe Do We Live In?” from @EricaKlarreich in @QuantaMagazine.

* Charles Babbage

###

As we contemplate codes, we might send communicative birthday greetings to a frequently-featured hero of your correspondent, Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electrical engineer– and cryptographer– he is known as “the father of information theory.”  But he is also remembered for his contributions to digital circuit design theory and for his cryptanalysis work during World War II, both as a codebreaker and as a designer of secure communications systems.


 source

“I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.”*…

Claude Shannon with his creation, Theseus the maze-solving mouse, an early illustration of machine learning and a follow-on project to the work described below

Readers will know of your correspondent’s fascination with the remarkable Claude Shannon (see here and here), remembered as “the father of information theory,” but seminally involved in so much more. In a recent piece in IEEE Spectrum, the redoubtable Rodney Brooks argues that we should add another credit to Shannon’s list…

Among the great engineers of the 20th century, who contributed the most to our 21st-century technologies? I say: Claude Shannon.

Shannon is best known for establishing the field of information theory. In a 1948 paper, one of the greatest in the history of engineering, he came up with a way of measuring the information content of a signal and calculating the maximum rate at which information could be reliably transmitted over any sort of communication channel. The article, titled “A Mathematical Theory of Communication,” describes the basis for all modern communications, including the wireless Internet on your smartphone and even an analog voice signal on a twisted-pair telephone landline. In 1966, the IEEE gave him its highest award, the Medal of Honor, for that work.
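Both halves of that achievement, measuring information and bounding reliable transmission, reduce to short formulas. A minimal sketch (the source probabilities below are made up for illustration): Shannon entropy gives a source’s information content in bits, and for a binary symmetric channel that flips each bit with probability f, the capacity is C = 1 - H(f).

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2 p)
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Information content of a source emitting four symbols with these probabilities
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits per symbol

def bsc_capacity(f):
    # Capacity of a binary symmetric channel that flips bits with probability f:
    # C = 1 - H(f), the highest rate at which information can be sent with
    # arbitrarily small error (Shannon's noisy-channel coding theorem).
    return 1.0 - entropy([f, 1.0 - f])

print(bsc_capacity(0.1))   # ~0.53 bits per channel use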

If information theory had been Shannon’s only accomplishment, it would have been enough to secure his place in the pantheon. But he did a lot more…

In 1950 Shannon published an article in Scientific American and also a research paper describing how to program a computer to play chess. He went into detail on how to design a program for an actual computer…
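Shannon’s 1950 proposal was, in essence, to look ahead a fixed number of moves, score the resulting positions with a static evaluation function (material, mobility, and so on), and back the scores up the game tree by minimax. A minimal, game-agnostic sketch of that backing-up step; `legal_moves`, `apply_move`, and `evaluate` are hypothetical stand-ins for a real chess implementation:

```python
# A minimal minimax sketch in the spirit of Shannon's 1950 proposal:
# search a fixed depth, score leaves with a static evaluation function,
# and back the scores up the tree. The three callables are hypothetical
# stand-ins for a real game implementation.
def minimax(position, depth, maximizing, legal_moves, apply_move, evaluate):
    moves = legal_moves(position)
    if depth == 0 or not moves:
        return evaluate(position)   # e.g. material balance from White's view
    scores = [
        minimax(apply_move(position, m), depth - 1, not maximizing,
                legal_moves, apply_move, evaluate)
        for m in moves
    ]
    return max(scores) if maximizing else min(scores)
```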

Shannon did all this at a time when there were fewer than 10 computers in the world. And they were all being used for numerical calculations. He began his research paper by speculating on all sorts of things that computers might be programmed to do beyond numerical calculations, including designing relay and switching circuits, designing electronic filters for communications, translating between human languages, and making logical deductions. Computers do all these things today…

The “father of information theory” also paved the way for AI: “How Claude Shannon Helped Kick-start Machine Learning,” from @rodneyabrooks in @IEEESpectrum.

* Claude Shannon (who may or may not have been kidding…)

###

As we ponder possibility, we might send uncertain birthday greetings to Werner Karl Heisenberg; he was born on this date in 1901.  A theoretical physicist, he made important contributions to the theories of the hydrodynamics of turbulent flows, the atomic nucleus, ferromagnetism, superconductivity, cosmic rays, and subatomic particles.  But he is most widely remembered as a pioneer of quantum mechanics and author of what’s become known as the Heisenberg Uncertainty Principle.  Heisenberg was awarded the Nobel Prize in Physics for 1932 “for the creation of quantum mechanics.”

During World War II, Heisenberg was part of the team attempting to create an atomic bomb for Germany– for which he was arrested and detained by the Allies at the end of the conflict.  He was returned to Germany, where he became director of the Kaiser Wilhelm Institute for Physics, which soon thereafter was renamed the Max Planck Institute for Physics. He later served as president of the German Research Council, chairman of the Commission for Atomic Physics, chairman of the Nuclear Physics Working Group, and president of the Alexander von Humboldt Foundation.

“Some things are so serious that one can only joke about them.”

Werner Heisenberg

source

“No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it.”*…

A 19th-century thought experiment that still motivates physicists– and information scientists…

The universe bets on disorder. Imagine, for example, dropping a thimbleful of red dye into a swimming pool. All of those dye molecules are going to slowly spread throughout the water.

Physicists quantify this tendency to spread by counting the number of possible ways the dye molecules can be arranged. There’s one possible state where the molecules are crowded into the thimble. There’s another where, say, the molecules settle in a tidy clump at the pool’s bottom. But there are uncountable billions of permutations where the molecules spread out in different ways throughout the water. If the universe chooses from all the possible states at random, you can bet that it’s going to end up with one of the vast set of disordered possibilities.
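A toy version of that counting argument, treating each of N dye molecules as being either back in the thimble-sized region or somewhere out in the pool (the numbers are illustrative; the lopsidedness is the point):

```python
from math import comb

N = 100   # dye molecules, each "in the thimble" or "out in the pool"

all_in_thimble = comb(N, N)        # exactly 1 arrangement
half_and_half  = comb(N, N // 2)   # ~1e29 arrangements

print(all_in_thimble, half_and_half)
# With every arrangement equally likely, the chance of catching all the dye
# back in the thimble is 1 in 2**N, roughly 1 in 1.3e30: effectively never.
```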

Seen in this way, the inexorable rise in entropy, or disorder, as quantified by the second law of thermodynamics, takes on an almost mathematical certainty. So of course physicists are constantly trying to break it.

One almost did. A thought experiment devised by the Scottish physicist James Clerk Maxwell in 1867 stumped scientists for 115 years. And even after a solution was found, physicists have continued to use “Maxwell’s demon” to push the laws of the universe to their limits…

A thorny thought experiment has been turned into a real experiment—one that physicists use to probe the physics of information: “How Maxwell’s Demon Continues to Startle Scientists,” from Jonathan O’Callaghan (@Astro_Jonny).

* Philip K. Dick

###

As we reconsider the random, we might send carefully-calculated birthday greetings to Félix Édouard Justin Émile Borel; he was born on this date in 1871. A mathematician (and politician, who served as French Minister of the Navy), he is remembered for his foundational work in measure theory and probability. He published a number of research papers on game theory and was the first to define games of strategy.

But Borel may be best remembered for a thought experiment he introduced in one of his books, proposing that a monkey hitting keys at random on a typewriter keyboard will – with absolute certainty – eventually type every book in the Bibliothèque Nationale de France. This is now popularly known as the infinite monkey theorem.
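The arithmetic behind Borel’s claim is short: with a k-key typewriter, each independent block of keystrokes matches a given text of length L with probability p = (1/k)**L, and the chance of at least one match among n blocks, 1 - (1 - p)**n, climbs toward 1 as n grows without bound. A quick illustration (the keyboard size and target text are arbitrary stand-ins):

```python
k = 26                     # keys on a letters-only typewriter, arbitrary
target = "hamlet"          # a short stand-in for a whole book
p = (1.0 / k) ** len(target)   # chance one 6-keystroke block matches

for n in (10**6, 10**9, 10**12):
    # probability of at least one match in n independent blocks
    print(n, 1 - (1 - p) ** n)
# ~0.003, ~0.96, then indistinguishable from 1: certainty in the limit
```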

source

“We know the past but cannot control it. We control the future but cannot know it.”*…

Readers will know of your correspondent’s fascination with– and admiration for– Claude Shannon…

Within engineering and mathematics circles, Shannon is a revered figure. At 21 [in 1937], he published what’s been called the most important master’s thesis of all time, explaining how binary switches could do logic and laying the foundation for all future digital computers. At the age of 32, he published A Mathematical Theory of Communication, which Scientific American called “the Magna Carta of the information age.” Shannon’s masterwork invented the bit, or the objective measurement of information, and explained how digital codes could allow us to compress and send any message with perfect accuracy.
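The thesis’s core insight translates directly into a few lines: switches wired in series behave like AND, switches wired in parallel like OR, so relay circuits can compute Boolean logic. A minimal sketch (the example circuit is arbitrary):

```python
# Series and parallel switch connections as Boolean operations,
# the mapping at the heart of Shannon's 1937 master's thesis.
def series(a, b):
    # current flows only if both switches are closed -> AND
    return a and b

def parallel(a, b):
    # current flows if either switch is closed -> OR
    return a or b

# An arbitrary example circuit: (a AND b) OR (NOT c)
def circuit(a, b, c):
    return parallel(series(a, b), not c)

print(circuit(True, True, False))    # True
print(circuit(False, False, False))  # True (the NOT c branch conducts)
print(circuit(False, True, True))    # False
```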

But Shannon wasn’t just a brilliant theoretical mind — he was a remarkably fun, practical, and inventive one as well. There are plenty of mathematicians and engineers who write great papers. There are fewer who, like Shannon, are also jugglers, unicyclists, gadgeteers, first-rate chess players, codebreakers, expert stock pickers, and amateur poets.

Shannon worked on the top-secret transatlantic phone line connecting FDR and Winston Churchill during World War II and co-built what was arguably the world’s first wearable computer. He learned to fly airplanes and played the jazz clarinet. He rigged up a false wall in his house that could rotate with the press of a button, and he once built a gadget whose only purpose when it was turned on was to open up, release a mechanical hand, and turn itself off. Oh, and he once had a photo spread in Vogue.

Think of him as a cross between Albert Einstein and the Dos Equis guy…

From Jimmy Soni (@jimmyasoni), co-author of A Mind At Play: How Claude Shannon Invented the Information Age: “11 Life Lessons From History’s Most Underrated Genius.”

* Claude Shannon

###

As we learn from the best, we might recall that it was on this date in 1946 that an early beneficiary of Shannon’s thinking, the ENIAC (Electronic Numerical Integrator And Computer), was first demonstrated in operation.  (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943, as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania, where it was built.  The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world.  ENIAC’s basic clock speed was 100,000 cycles per second. Today’s home computers have clock speeds of several billion cycles per second.

 source
