(Roughly) Daily

Posts Tagged ‘information theory’

“Information is a difference that makes a difference”*…

 

Shannon information

 

Information was something guessed at rather than spoken of, something implied in a dozen ways before it was finally tied down. Information was a presence offstage. It was there in the studies of the physiologist Hermann von Helmholtz, who, electrifying frog muscles, first timed the speed of messages in animal nerves just as Thomson was timing the speed of messages in wires. It was there in the work of physicists like Rudolf Clausius and Ludwig Boltzmann, who were pioneering ways to quantify disorder—entropy—little suspecting that information might one day be quantified in the same way. Above all, information was in the networks that descended in part from the first attempt to bridge the Atlantic with underwater cables. In the attack on the practical engineering problems of connecting Points A and B—what is the smallest number of wires we need to string up to handle a day’s load of messages? how do we encrypt a top-secret telephone call?—the properties of information itself, in general, were gradually uncovered.

By the time of Claude Shannon’s childhood, the world’s communications networks were no longer passive wires acting as conduits for electricity, a kind of electron plumbing. They were continent-spanning machines, arguably the most complex machines in existence. Vacuum-tube amplifiers strung along the telephone lines added power to voice signals that would have otherwise attenuated and died out on their thousand-mile journeys. A year before Shannon was born, in fact, Bell and Watson inaugurated the transcontinental phone line by reenacting their first call, this time with Bell in New York and Watson in San Francisco. By the time Shannon was in elementary school, feedback systems managed the phone network’s amplifiers automatically, holding the voice signals stable and silencing the “howling” or “singing” noises that plagued early phone calls, even as the seasons turned and the weather changed around the sensitive wires that carried them. Each year that Shannon placed a call, he was less likely to speak to a human operator and more likely to have his call placed by machine, by one of the automated switchboards that Bell Labs grandly called a “mechanical brain.” In the process of assembling and refining these sprawling machines, Shannon’s generation of scientists came to understand information in much the same way that an earlier generation of scientists came to understand heat in the process of building steam engines.

It was Shannon who made the final synthesis, who defined the concept of information and effectively solved the problem of noise. It was Shannon who was credited with gathering the threads into a new science…
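For the record, here is the definition Shannon settled on in the 1948 paper: the information of a source that emits symbols with probabilities p_i is its entropy, measured in bits,

```latex
H = -\sum_{i} p_i \log_2 p_i
```

A fair coin toss yields exactly one bit; a loaded coin yields less, which is why predictable messages compress well and why a noisy channel has a calculable capacity.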

The story of Claude Shannon, his colorful life– and the birth of the Information Age: “How Information Got Re-Invented.”

* Gregory Bateson

###

As we separate the signal from the noise, we might send communicative birthday greetings to the subject of today’s main post, Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electrical engineer, and cryptographer, he is, for reasons explained in the article featured above, known as “the father of information theory.”  But he is also remembered for his contributions to digital circuit design theory and for his cryptanalysis work during World War II, both as a codebreaker and as a designer of secure communications systems.

source

 

“Intelligence is really a kind of taste: taste in ideas.”*…

 

Two researchers have proposed a new way of thinking about intelligence: one in which intelligence is related to entropy– one in which intelligence is a fundamentally thermodynamic process…

Alexander Wissner-Gross, a physicist at Harvard University and the Massachusetts Institute of Technology, and Cameron Freer, a mathematician at the University of Hawaii at Manoa, developed an equation that they say describes many intelligent or cognitive behaviors, such as upright walking and tool use.

The researchers suggest that intelligent behavior stems from the impulse to seize control of future events in the environment. This is the exact opposite of the classic science-fiction scenario in which computers or robots become intelligent, then set their sights on taking over the world…
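The equation itself is compact. As given in the Physical Review Letters paper (“Causal Entropic Forces,” PRL 110, 168702), it defines a force that pushes a system toward states from which the widest variety of futures remains reachable; in the authors’ notation,

```latex
F(X_0, \tau) = T_c \, \nabla_X S_c(X, \tau) \big|_{X_0}
```

where S_c(X, τ) is the entropy of the distribution of possible paths of duration τ available from state X, and T_c is a “causal path temperature” setting the strength of the drive. Maximizing future freedom of action, rather than pursuing any explicit goal, is what the authors argue produces behaviors like balancing a pole or manipulating a tool.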

Read the full story at the always-illuminating 3 Quarks Daily; read the paper in the journal Physical Review Letters.  

* Susan Sontag

###

As we reckon that maybe “hot air” isn’t so bad after all, we might send intelligent birthday greetings to Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electronic engineer, and cryptographer, Shannon is best remembered as the “father of information theory” (for his 1948 paper that launched the field); but he is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master’s degree student at MIT, he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct and resolve any logical, numerical relationship– widely considered to be the most important master’s thesis of all time.
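Shannon’s insight is simple to demonstrate with modern tools. Here is a minimal sketch in Python (purely illustrative, standing in for his relay-and-switch notation): a ripple-carry adder, a numerical relationship built from nothing but the Boolean operations a switching circuit can realize, where contacts in series compute AND and contacts in parallel compute OR.

```python
# Binary addition expressed purely in Boolean algebra: the kind of
# logical/numerical relationship Shannon's thesis showed switching
# circuits could construct and resolve.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add three bits using only XOR, AND, OR."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add_bits(x: list[int], y: list[int]) -> list[int]:
    """Ripple-carry addition of two little-endian bit lists of equal length."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 3 + 5 = 8, computed with nothing but Boolean operations:
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. binary 1000
```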

 source

 

 
