“Machines take me by surprise with great frequency”*…
In search of universals in the 17th century, Gottfried Leibniz imagined the calculus ratiocinator, a theoretical framework for universal logical calculation– one that led Norbert Wiener to suggest that Leibniz be considered the patron saint of cybernetics. In the 19th century, Charles Babbage and Ada Lovelace took a pair of whacks at making it real.
Ironically, it was confronting the impossibility of a universal calculator that led to modern computing. In 1936 (the same year that Charlie Chaplin released Modern Times) Alan Turing (following on Gödel’s demonstration that mathematics is incomplete and addressing Hilbert’s “decision problem,” querying the limits of computation) published the (notional) design of a “machine” that elegantly demonstrated those limits– and, as Sheon Han explains, birthed computing as we know it…
… [Hilbert’s] question would lead to a formal definition of computability, one that allowed mathematicians to answer a host of new problems and laid the foundation for theoretical computer science.
The definition came from a 23-year-old grad student named Alan Turing, who in 1936 wrote a seminal paper that not only formalized the concept of computation, but also settled a fundamental question in mathematics and created the intellectual foundation for the invention of the electronic computer. Turing’s great insight was to provide a concrete answer to the computation question in the form of an abstract machine, later named the Turing machine by his doctoral adviser, Alonzo Church. It’s abstract because it doesn’t (and can’t) physically exist as a tangible device. Instead, it’s a conceptual model of computation: If the machine can calculate a function, then the function is computable.
…
With his abstract machine, Turing established a model of computation to answer the Entscheidungsproblem, which formally asks: Given a set of mathematical axioms, is there a mechanical process — a set of instructions, which today we’d call an algorithm — that can always determine whether a given statement is true?…
… in 1936, Church and Turing — using different methods — independently proved that there is no general way of solving every instance of the Entscheidungsproblem. For example, some games, such as John Conway’s Game of Life, are undecidable: no algorithm can determine whether a given pattern will ever appear from a given initial configuration.
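The contrast is worth seeing concretely: each individual generation of the Game of Life is trivially computable; what’s undecidable is a pattern’s long-run fate. A minimal sketch of one generation in Python (the set-based representation and the names here are illustrative, not from the article):

```python
from collections import Counter

def life_step(live):
    """Advance a set of live (x, y) cells by one Game of Life generation."""
    # Count how many live neighbors each cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or exactly 2 and was already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" oscillates between a horizontal and a vertical bar:
blinker = {(0, 1), (1, 1), (2, 1)}
```

Simulating forward is easy; the theorem says no amount of cleverness yields a general shortcut that predicts, for every starting configuration, what will eventually emerge.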
…
Beyond answering these fundamental questions, Turing’s machine also led directly to the development of modern computers, through a variant known as the universal Turing machine. This is a special kind of Turing machine that can simulate any other Turing machine on any input. It can read a description of other Turing machines (their rules and input tapes) and simulate their behaviors on its own input tape, producing the same output that the simulated machine would produce, just as today’s computers can read any program and execute it. In 1945, John von Neumann proposed a computer architecture — called the von Neumann architecture — that made the universal Turing machine concept possible in a real-life machine…
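The key move — treating another machine’s rules as ordinary input data — is easy to sketch. A toy Turing-machine simulator in Python (not Turing’s notation; the names and the bit-flipping example machine are inventions of this sketch):

```python
def run_tm(rules, tape, state="q0", blank="_", max_steps=10_000):
    """Simulate a Turing machine whose rules are supplied as data.

    rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    The machine halts on reaching state 'halt'.
    """
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, blank)
        state, tape[pos], move = rules[(state, symbol)]
        pos += move
    # Render the visited portion of the tape back to a string.
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1))

# An example machine: flip every bit, halt at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", 0),
}
```

The transition table `flip` is just data; `run_tm` reads it the way a universal machine reads a description of another machine — and the way today’s computers read any program and execute it.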
As Turing said, “if a machine is expected to be infallible, it cannot also be intelligent.” On the importance of thought experiments: “The Most Important Machine That Was Never Built,” from @sheonhan in @QuantaMagazine.
* Alan Turing
###
As we sum it up, we might spare a thought for Martin Gardner; he died on this date in 2010. Though not an academic, nor ever a formal student of math or science, he wrote widely and prolifically on both subjects in such popular books as The Ambidextrous Universe and The Relativity Explosion and as the “Mathematical Games” columnist for Scientific American. Indeed, his elegant– and understandable– puzzles delighted professional and amateur readers alike, and helped inspire a generation of young mathematicians.
Gardner’s interests were wide; in addition to the math and science that were his power alley, he studied and wrote on topics that included magic, philosophy, religion, and literature (cf., especially, his work on Lewis Carroll– including the delightful Annotated Alice– and on G.K. Chesterton). And he was a fierce debunker of pseudoscience: a founding member of CSICOP and contributor of a regular column (“Notes of a Fringe Watcher,” from 1983 to 2002) in Skeptical Inquirer, that organization’s magazine.

“We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves”*…
Lee Wilkins on the interconnected development of digital and textile technology…
I’ve always been fascinated with the co-evolution of computation and textiles. Some of the first industrialized machines produced elaborate textiles on a mass scale, the most famous example of which is the jacquard loom. It used punch cards to create complex designs programmatically, similar to the computer punch cards that were used until the 1970s. But craft work and computation have many parallel processes. The process of pulling wires is similar to the way yarn is made, and silkscreening is common in both fabric and printed circuit board production. Another of my favorite examples is rubylith, a light-blocking film used to prepare silkscreens for fabric printing and to imprint designs on integrated circuits.
Of course, textiles and computation have diverged on their evolutionary paths, but I love finding the places where they do converge – or inventing them myself. Recently, I’ve had the opportunity to work with a gigantic Tajima digital embroidery machine [see above]. This room-sized machine, affectionately referred to as The Spider Queen by the technician, loudly sews hundreds of stitches per minute – something that would take me months to make by hand. I’m using it to make large soft speaker coils by laying conductive fibers on a thick woven substrate. I’m trying to recreate functional coils – for use as radios, speakers, inductive power, and motors – in textile form. Given the shared history, I can imagine a parallel universe where embroidery is considered high-tech and computers a crafty hobby…
Notes, in @the_prepared.
* Ada Lovelace, programmer of the Analytical Engine, which was designed by her collaborator Charles Babbage (though never completed in their lifetimes)
###
As we investigate intertwining, we might recall that it was on this date in 1922 that Frederick Banting and Charles Best announced their discovery of insulin the prior year (with James Collip). The co-inventors sold the insulin patent to the University of Toronto for a mere $1. They wanted everyone who needed their medication to be able to afford it.
Today, Banting and his colleagues would be spinning in their graves: their drug, one on which many of the 30 million Americans with diabetes rely, has become the poster child for pharmaceutical price gouging.
The cost of the four most popular types of insulin has tripled over the past decade, and the out-of-pocket prescription costs patients now face have doubled. By 2016, the average price per month rose to $450 — and costs continue to rise, so much so that as many as one in four people with diabetes are now skimping on or skipping lifesaving doses…
Best (left) and Banting with one of the diabetic dogs used in their experiments with insulin
“Plans are worthless, but planning is everything”*…
We’re living through a real-time natural experiment on a global scale. The differential performance of countries, cities and regions in the face of the COVID-19 pandemic is a live test of the effectiveness, capacity and legitimacy of governments, leaders and social contracts.
The progression of the initial outbreak in different countries followed three main patterns. Countries like Singapore and Taiwan represented Pattern A, where (despite many connections to the original source of the outbreak in China) vigilant government action effectively cut off community transmission, keeping total cases and deaths low. China and South Korea represented Pattern B: an initial uncontrolled outbreak followed by draconian government interventions that succeeded in getting at least the first wave of the outbreak under control.
Pattern C is represented by countries like Italy and Iran, where waiting too long to lock down populations led to a short-term exponential growth of new cases that overwhelmed the healthcare system and resulted in a large number of deaths. In the United States, the lack of effective and universally applied social isolation mechanisms, as well as a fragmented healthcare system and a significant delay in rolling out mass virus testing, led to a replication of Pattern C, at least in densely populated places like New York City and Chicago.
Despite the Chinese and Americans blaming each other and crediting their own political system for successful responses, the course of the virus didn’t score easy political points on either side of the new Cold War. Regime type isn’t correlated with outcomes. Authoritarian and democratic countries are included in each of the three patterns of responses: authoritarian China and democratic South Korea had effective responses to a dramatic outbreak; authoritarian Singapore and democratic Taiwan both managed to quarantine and contain the virus; authoritarian Iran and democratic Italy both experienced catastrophe.
It’s generally a mistake to make long-term forecasts in the midst of a hurricane, but some outlines of lasting shifts are emerging. First, a government or society’s capacity for technical competence in executing plans matters more than ideology or structure. The most effective arrangements for dealing with the pandemic have been found in countries that combine a participatory public culture of information sharing with operational experts competently executing decisions. Second, hyper-individualist views of privacy and other forms of risk are likely to be submerged as countries move to restrict personal freedoms and use personal data to manage public and aggregated social risks. Third, countries that are able to successfully take a longer view of planning and risk management will be at a significant advantage…
From Steve Weber and @nils_gilman, an argument for the importance of operational expertise, plans for the long-term, and the socialization of some risks: “The Long Shadow Of The Future.”
* Dwight D. Eisenhower
###
As we make ourselves ready, we might recall that it was on this date in 1822 that Charles Babbage [see almanac entry here] proposed a Difference Engine in a paper to the Royal Astronomical Society (which he’d helped found two years earlier).
In Babbage’s time, printed mathematical tables were calculated by human computers… in other words, by hand. They were central to navigation, science, and engineering, as well as mathematics– but mistakes occurred, both in transcription and in calculation. Babbage determined to mechanize the process and to reduce– indeed, to eliminate– errors. His Difference Engine was intended as precisely that sort of mechanical calculator (in this instance, to compute values of polynomial functions).
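The method behind the engine – tabulating a polynomial by repeated addition of finite differences, no multiplication required – is easy to sketch in Python (used here purely for illustration; Babbage famously demonstrated the principle on f(x) = x² + x + 41):

```python
def difference_table(values, order):
    """Initial row of the difference table: f(0), Δf(0), Δ²f(0), ..."""
    row, diffs = list(values), []
    for _ in range(order + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(diffs, count):
    """Extend the table using only addition, as the engine did.

    For a degree-n polynomial the n-th differences are constant, so each
    new value falls out of n additions -- mechanizable with gears.
    """
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # Each entry absorbs the difference below it: pure addition.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

# Seed the table from the first three values of f(x) = x**2 + x + 41:
f = lambda x: x * x + x + 41
seed = difference_table([f(x) for x in range(3)], order=2)
```

Once the first few values seed the table, every subsequent value of the polynomial is produced by addition alone – exactly the operation Babbage could build out of wheels and carry mechanisms.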
In 1833 he began work on his programmable Analytical Engine, the forerunner of modern computers, with coding help from Ada Lovelace, who created an algorithm for the Analytical Engine to calculate a sequence of Bernoulli numbers– for which she is remembered as the first computer programmer.
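Lovelace’s famous Note G laid out how the Engine could compute Bernoulli numbers step by step. A modern restatement of that computation in Python – using the standard recurrence and the convention B₁ = −1/2, not her exact formulation or numbering:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers via the standard recurrence:

        B_0 = 1,   sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1,

    solved for B_m at each step. Exact rational arithmetic throughout.
    """
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B
```

Each new Bernoulli number is built from the ones already tabulated – the same looping, accumulating structure Lovelace laid out as a sequence of Engine operations.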

A portion of the Difference Engine