(Roughly) Daily

Posts Tagged ‘cryptanalysis’

“One of the most singular characteristics of the art of deciphering is the strong conviction possessed by every person, even moderately acquainted with it, that he is able to construct a cipher which nobody else can decipher.”*…

And yet, for centuries no one has succeeded. Now, as Erica Klarreich reports, cryptographers want to know which of five possible worlds we inhabit, which will reveal whether truly secure cryptography is even possible…

Many computer scientists focus on overcoming hard computational problems. But there’s one area of computer science in which hardness is an asset: cryptography, where you want hard obstacles between your adversaries and your secrets.

Unfortunately, we don’t know whether secure cryptography truly exists. Over millennia, people have created ciphers that seemed unbreakable right until they were broken. Today, our internet transactions and state secrets are guarded by encryption methods that seem secure but could conceivably fail at any moment.

To create a truly secure (and permanent) encryption method, we need a computational problem that’s hard enough to create a provably insurmountable barrier for adversaries. We know of many computational problems that seem hard, but maybe we just haven’t been clever enough to solve them. Or maybe some of them are hard, but their hardness isn’t of a kind that lends itself to secure encryption. Fundamentally, cryptographers wonder: Is there enough hardness in the universe to make cryptography possible?
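The asymmetry cryptographers are hunting for can be illustrated, in toy form, with integer factoring: multiplying two primes takes an instant, while recovering them from the product seems to require search. A hypothetical Python sketch (a teaching illustration of a conjectured one-way function, not a secure scheme):

```python
# Toy illustration of a (conjectured) one-way function: multiplying two
# primes is easy, but recovering them from the product appears hard.
# This is a teaching sketch, not a secure cryptosystem.

def forward(p: int, q: int) -> int:
    """The 'easy' direction: a single multiplication."""
    return p * q

def invert_by_trial_division(n: int) -> tuple[int, int]:
    """The 'hard' direction: brute-force search for a factor.
    For the enormous moduli used in real encryption, this search is
    hopelessly slow -- which is exactly the barrier encryption needs."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself was prime

n = forward(101, 113)               # easy: 11413
print(invert_by_trial_division(n))  # slow in general: (101, 113)
```

Whether inverting such functions is *provably* hard — rather than merely hard-so-far — is precisely the open question behind Impagliazzo’s five worlds.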

In 1995, Russell Impagliazzo of the University of California, San Diego broke down the question of hardness into a set of sub-questions that computer scientists could tackle one piece at a time. To summarize the state of knowledge in this area, he described five possible worlds — fancifully named Algorithmica, Heuristica, Pessiland, Minicrypt and Cryptomania — with ascending levels of hardness and cryptographic possibility. Any of these could be the world we live in…

Explore each of them– and their implications for secure encryption– at “Which Computational Universe Do We Live In?” from @EricaKlarreich in @QuantaMagazine.

* Charles Babbage


As we contemplate codes, we might send communicative birthday greetings to a frequently-featured hero of your correspondent, Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electrical engineer– and cryptographer– he is known as “the father of information theory.”  But he is also remembered for his contributions to digital circuit design theory and for his cryptanalysis work during World War II, both as a codebreaker and as a designer of secure communications systems.



“Information is a difference that makes a difference”*…


Shannon information


Information was something guessed at rather than spoken of, something implied in a dozen ways before it was finally tied down. Information was a presence offstage. It was there in the studies of the physiologist Hermann von Helmholtz, who, electrifying frog muscles, first timed the speed of messages in animal nerves just as Thomson was timing the speed of messages in wires. It was there in the work of physicists like Rudolf Clausius and Ludwig Boltzmann, who were pioneering ways to quantify disorder—entropy—little suspecting that information might one day be quantified in the same way. Above all, information was in the networks that descended in part from the first attempt to bridge the Atlantic with underwater cables. In the attack on the practical engineering problems of connecting Points A and B—what is the smallest number of wires we need to string up to handle a day’s load of messages? how do we encrypt a top-secret telephone call?—the properties of information itself, in general, were gradually uncovered.

By the time of Claude Shannon’s childhood, the world’s communications networks were no longer passive wires acting as conduits for electricity, a kind of electron plumbing. They were continent-spanning machines, arguably the most complex machines in existence. Vacuum-tube amplifiers strung along the telephone lines added power to voice signals that would have otherwise attenuated and died out on their thousand-mile journeys. A year before Shannon was born, in fact, Bell and Watson inaugurated the transcontinental phone line by reenacting their first call, this time with Bell in New York and Watson in San Francisco. By the time Shannon was in elementary school, feedback systems managed the phone network’s amplifiers automatically, holding the voice signals stable and silencing the “howling” or “singing” noises that plagued early phone calls, even as the seasons turned and the weather changed around the sensitive wires that carried them. Each year that Shannon placed a call, he was less likely to speak to a human operator and more likely to have his call placed by machine, by one of the automated switchboards that Bell Labs grandly called a “mechanical brain.” In the process of assembling and refining these sprawling machines, Shannon’s generation of scientists came to understand information in much the same way that an earlier generation of scientists came to understand heat in the process of building steam engines.

It was Shannon who made the final synthesis, who defined the concept of information and effectively solved the problem of noise. It was Shannon who was credited with gathering the threads into a new science…

The story of Claude Shannon, his colorful life– and the birth of the Information Age: “How Information Got Re-Invented.”

* Gregory Bateson


As we separate the signal from the noise, we might send communicative birthday greetings to the subject of today’s main post, Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electrical engineer, and cryptographer, he is, for reasons explained in the article featured above, known as “the father of information theory.”  But he is also remembered for his contributions to digital circuit design theory and for his cryptanalysis work during World War II, both as a codebreaker and as a designer of secure communications systems.



Coded references…

Readers will recall the role that Alan Turing and the team at Bletchley Park played in cracking the German Enigma code; some analysts and historians reckon that their work may have shortened World War Two by “not less than two years.”

That code was generated by– and thus cracking it turned on deconstructing and understanding– an Enigma Machine.
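The machine’s essential trick was a substitution alphabet that changed with every keypress: the rotors stepped like an odometer, so the same plaintext letter rarely encrypted the same way twice, defeating simple frequency analysis. A much-simplified, hypothetical single-rotor sketch (real Enigmas had three or more rotors, a reflector, and a plugboard):

```python
import string

# Hypothetical single-rotor sketch of the Enigma idea: fixed wiring,
# shifted by a rotor position that steps after every letter.
# Real machines used 3+ rotors, a reflector, and a plugboard.
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # historical rotor I wiring
ALPHABET = string.ascii_uppercase

def encode(text: str, start: int = 0) -> str:
    out = []
    pos = start
    for ch in text:
        i = (ALPHABET.index(ch) + pos) % 26         # enter at shifted contact
        o = (ALPHABET.index(WIRING[i]) - pos) % 26  # exit, undoing the shift
        out.append(ALPHABET[o])
        pos = (pos + 1) % 26                        # rotor steps each keypress
    return "".join(out)

print(encode("AAA"))  # prints "EJK": same letter, three different outputs
```

Note the three distinct ciphertext letters for a repeated plaintext letter — it was that ever-shifting mapping, multiplied across several rotors, that made the machine so hard to break.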


Understandably, few such machines were ever built.  And equally understandably, those that survive are extremely expensive collectibles.  But readers need not fear!  Now, thanks to our friends at Instructables, one can convert a “Kid’s Game to an Enigma Machine“:

Step-by-step instructions at Instructables.

Readers might also want to visit Cabinet Magazine‘s wonderful “How to Make Anything Signify Anything,” a profile of American code breaker (and code maker) William Friedman:

By the time he retired from the National Security Agency in 1955, Friedman had served for more than thirty years as his government’s chief cryptographer, and—as leader of the team that broke the Japanese PURPLE code in World War II, co-inventor of the US Army’s best cipher machine, author of the papers that gave the field its mathematical foundations, and coiner of the very term cryptanalysis—he had arguably become the most important code-breaker in modern history.

As we reach for our decoder rings, we might recall that it was on this date in 1884 that the states of Alabama, Georgia, Illinois, Indiana, Kentucky, Mississippi, North Carolina, South Carolina, Tennessee, and Virginia were struck by (at least) 50 tornadoes.  Known as “The Enigma Outbreak,” it did an estimated $3–4 million in tornado damage (in 1884 dollars; plus an unknown amount of flood and other damage), destroying over 10,000 structures.

Photo: © D. Burgess / NOAA (source)


