(Roughly) Daily

Posts Tagged ‘Charles Babbage’

“No problem can be solved from the same level of consciousness that created it”*…

Christof Koch settles his bet with David Chalmers (with a case of wine)

… perhaps especially not the problem of consciousness itself. At least for now…

A 25-year science wager has come to an end. In 1998, neuroscientist Christof Koch bet philosopher David Chalmers that the mechanism by which the brain’s neurons produce consciousness would be discovered by 2023. Both scientists agreed publicly on 23 June, at the annual meeting of the Association for the Scientific Study of Consciousness (ASSC) in New York City, that it is still an ongoing quest — and declared Chalmers the winner.

What ultimately helped to settle the bet was a key study testing two leading hypotheses about the neural basis of consciousness, whose findings were unveiled at the conference.

“It was always a relatively good bet for me and a bold bet for Christof,” says Chalmers, who is now co-director of the Center for Mind, Brain and Consciousness at New York University. But he also says this isn’t the end of the story, and that an answer will come eventually: “There’s been a lot of progress in the field.”

Consciousness is everything a person experiences — what they taste, hear, feel and more. It is what gives meaning and value to our lives, Chalmers says.

Despite a vast effort — and a 25-year bet — researchers still don’t understand how our brains produce it. “It started off as a very big philosophical mystery,” Chalmers adds. “But over the years, it’s gradually been transmuting into, if not a ‘scientific’ mystery, at least one that we can get a partial grip on scientifically.”…

Neuroscientist Christof Koch wagered philosopher David Chalmers 25 years ago that researchers would learn how the brain achieves consciousness by now. But the quest continues: “Decades-long bet on consciousness ends — and it’s philosopher 1, neuroscientist 0,” from @Nature. Eminently worth reading in full for background and state-of-play.

* Albert Einstein

###

As we ponder pondering, we might spare a thought for Vannevar Bush; he died on this date in 1974. An engineer, inventor, and science administrator, he headed the World War II U.S. Office of Scientific Research and Development (OSRD), through which almost all wartime military R&D was carried out, including important developments in radar and the initiation and early administration of the Manhattan Project. He emphasized the importance of scientific research to national security and economic well-being, and was chiefly responsible for the movement that led to the creation of the National Science Foundation.

Bush also did his own work. Before the war, in 1925, at age 35, he developed the differential analyzer, a pioneering analog computer capable of solving differential equations. It put into productive form the mechanical concept left incomplete by Charles Babbage 50 years earlier, along with theoretical work by Lord Kelvin. The machine filled a 20×30 ft room. And with his 1945 essay “As We May Think,” describing a device he called the memex, he seeded ideas later adopted as internet hypertext links.

source

“We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves”*…

Lee Wilkins on the interconnected development of digital and textile technology…

I’ve always been fascinated with the co-evolution of computation and textiles. Some of the first industrialized machines produced elaborate textiles on a mass scale, the most famous example of which is the jacquard loom. It used punch cards to create complex designs programmatically, similar to the computer punch cards that were used until the 1970s. But craft work and computation have many parallel processes. The process of pulling wires is similar to the way yarn is made, and silkscreening is common in both fabric and printed circuit board production. Another of my favorite examples is rubylith, a light-blocking film used to prepare silkscreens for fabric printing and to imprint designs on integrated circuits.
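The Jacquard loom’s punch-card logic is simple enough to sketch in a few lines of modern code (a loose analogy, not a loom simulator — the card rows, `weave` function, and `#`/`.` rendering here are illustrative inventions): each hole in a card row lifts one warp thread, and the fabric’s pattern is just the cards read out in sequence.

```python
# A loose modern analogy for Jacquard punch cards: each "card" row is a
# string of holes ("1" lifts a warp thread, "0" leaves it down), and the
# woven pattern is simply the cards read out in order.

cards = [
    "10011001",
    "01100110",
    "01100110",
    "10011001",
]

def weave(cards):
    """Render each punched row as raised (#) and lowered (.) threads."""
    return ["".join("#" if hole == "1" else "." for hole in row) for row in cards]

for line in weave(cards):
    print(line)  # four rows of a simple symmetric motif
```

Swap in different cards and the same “machine” produces a different design — the sense in which the loom was programmable.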

Of course, textiles and computation have diverged on their evolutionary paths, but I love finding the places where they do converge – or inventing them myself. Recently, I’ve had the opportunity to work with a gigantic Tajima digital embroidery machine [see above]. This room-sized machine, affectionately referred to as The Spider Queen by the technician, loudly sews hundreds of stitches per minute – something that would take me months to make by hand. I’m using it to make large soft speaker coils by laying conductive fibers on a thick woven substrate. I’m trying to recreate functional coils – for use as radios, speakers, inductive power, and motors – in textile form. Given the shared history, I can imagine a parallel universe where embroidery is considered high-tech and computers a crafty hobby…

Notes, in @the_prepared.

Ada Lovelace, programmer of the Analytical Engine, which was designed (though never completed) by her collaborator Charles Babbage

###

As we investigate intertwining, we might recall that it was on this date in 1922 that Frederick Banting and Charles Best announced insulin, which they had discovered the previous year (with James Collip). The co-inventors sold the insulin patent to the University of Toronto for a mere $1: they wanted everyone who needed their medication to be able to afford it.

Today, Banting and his colleagues would be spinning in their graves: their drug, one on which many of the 30 million Americans with diabetes rely, has become the poster child for pharmaceutical price gouging.

The cost of the four most popular types of insulin has tripled over the past decade, and the out-of-pocket prescription costs patients now face have doubled. By 2016, the average price per month rose to $450 — and costs continue to rise, so much so that as many as one in four people with diabetes are now skimping on or skipping lifesaving doses.

Best (left) and Banting with one of the diabetic dogs used in their experiments with insulin

source

“The future is already here – it’s just not evenly distributed”*…

 


Security, transportation, energy, personal “stuff”– the 2018 staff of Popular Mechanics asked leading engineers and futurists for their visions of future cities, and built a handbook to navigate this new world: “The World of 2045.”

* William Gibson (in The Economist, December 4, 2003)

###

As we take the long view, we might spare a thought for Charles Babbage; he died on this date in 1871. A mathematician, philosopher, inventor, and mechanical engineer, Babbage is best remembered for originating the concept of a programmable computer. Anxious to eliminate inaccuracies in mathematical tables, he first built a small calculating machine able to compute squares.  He then produced prototypes of portions of a larger Difference Engine. (Georg and Edvard Scheutz later constructed the first working devices to the same design, and found them successful in limited applications.)  In 1833 he began his programmable Analytical Machine (AKA, the Analytical Engine), the forerunner of modern computers, with coding help from Ada Lovelace, who created an algorithm for it to calculate a sequence of Bernoulli numbers — for which she is remembered as the first computer programmer.
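For the curious: the “method of differences” that gave Babbage’s engine its name can be sketched in a few lines of modern code (a loose illustration of the idea, not his mechanism — the function name and loop here are my own). Because the second differences of the squares are constant (2), every entry in a table of squares can be produced with additions alone, exactly the kind of repetitive, error-free tabulation Babbage wanted to mechanize.

```python
# Minimal sketch of the method of finite differences: tabulating squares
# using only addition, as a Difference Engine would for a degree-2 polynomial.

def squares_by_differences(count):
    """Return [0**2, 1**2, ..., (count-1)**2] computed without multiplication."""
    value, first_diff = 0, 1   # first differences of squares are 1, 3, 5, ...
    second_diff = 2            # constant second difference for n**2
    table = []
    for _ in range(count):
        table.append(value)
        value += first_diff        # add the running first difference
        first_diff += second_diff  # advance the first difference
    return table

print(squares_by_differences(8))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The same trick tabulates any polynomial: a degree-n polynomial has constant nth differences, so n running sums suffice.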

Babbage’s other inventions include the cowcatcher, the dynamometer, the standard railroad gauge, uniform postal rates, occulting lights for lighthouses, Greenwich time signals, the heliograph, and the ophthalmoscope.  A true hacker, he was also passionate about cyphers and lock-picking.

 source

 

Written by (Roughly) Daily

October 18, 2018 at 1:01 am

“A flash of revelation and a flash of response”*…

 

“A Cellar Dive in the Bend,” c.1895, by Richard Hoe Lawrence and Henry G. Piffard

All photography requires light, but the light used in flash photography is unique — shocking, intrusive and abrupt. It’s quite unlike the light that comes from the sun, or even from ambient illumination. It explodes, suddenly, into darkness.

The history of flash goes right back to the challenges faced by early photographers who wanted to use their cameras in places where there was insufficient light — indoors, at night, in caves. The first flash photograph was probably a daguerreotype of a fossil, taken in 1839 by burning limelight…

In its early days, a sense of quasi-divine revelation was invoked by some flash photographers, especially when documenting deplorable social conditions. Jacob Riis, for example, working in New York in the late 1880s, used transcendental language to help underscore flash’s significance as an instrument of intervention and purgation. But it’s in relation to documentary photography that we encounter most starkly flash’s singular, and contradictory, aspects. It makes visible that which would otherwise remain in darkness; but it is often associated with unwelcome intrusion, a rupturing of private lives and interiors.

Yet flash brings a form of democracy to the material world. Many details take on unplanned prominence, as we see in the work of those Farm Security Administration photographers who used flash in the 1930s and laid bare the reality of poverty during the Depression. A sudden flare of light reveals each dent on a kitchen utensil and the label on each carefully stored can; each photograph on the mantel; each cherished ornament; each little heap of waste paper or discarded rag; each piece of polished furniture or stained floor or accumulation of dust; each wrinkle. Flash can make plain, bring out of obscurity, the appearance of things that may never before have been seen with such clarity…

Find illumination at “A short history of flash photography.”

* J.M. Coetzee, Disgrace

###

As we glory in the glare, we might send elegantly calculated birthday greetings to Augusta Ada King-Noel, Countess of Lovelace (née Byron); she was born on this date in 1815.  The daughter of the poet Lord Byron, she was the author of what can reasonably be considered the first “computer program”– and so one of the “parents” of the modern computer.  Her work was done in collaboration with her long-time friend and thought partner Charles Babbage (known as “the father of computers”), in particular in conjunction with Babbage’s work on the Analytical Engine.

Ada, Countess of Lovelace, 1840

source

 

 

Written by (Roughly) Daily

December 10, 2017 at 1:01 am

“in this case there were three determinate states the cat could be in: these being Alive, Dead, and Bloody Furious”*…

 

Of all the bizarre facets of quantum theory, few seem stranger than those captured by Erwin Schrödinger’s famous fable about the cat that is neither alive nor dead. It describes a cat locked inside a windowless box, along with some radioactive material. If the radioactive material happens to decay, then a device releases a hammer, which smashes a vial of poison, which kills the cat. If no radioactivity is detected, the cat lives. Schrödinger dreamt up this gruesome scenario to mock what he considered a ludicrous feature of quantum theory. According to proponents of the theory, before anyone opened the box to check on the cat, the cat was neither alive nor dead; it existed in a strange, quintessentially quantum state of alive-and-dead.

Today, in our LOLcats-saturated world, Schrödinger’s strange little tale is often played for laughs, with a tone more zany than somber. It has also become the standard bearer for a host of quandaries in philosophy and physics. In Schrödinger’s own time, Niels Bohr and Werner Heisenberg proclaimed that hybrid states like the one the cat was supposed to be in were a fundamental feature of nature. Others, like Einstein, insisted that nature must choose: alive or dead, but not both.

Although Schrödinger’s cat flourishes as a meme to this day, discussions tend to overlook one key dimension of the fable: the environment in which Schrödinger conceived it in the first place. It’s no coincidence that, in the face of a looming World War, genocide, and the dismantling of German intellectual life, Schrödinger’s thoughts turned to poison, death, and destruction. Schrödinger’s cat, then, should remind us of more than the beguiling strangeness of quantum mechanics. It also reminds us that scientists are, like the rest of us, humans who feel—and fear…

More of this sad story at “How Einstein and Schrödinger Conspired to Kill a Cat.”

* Terry Pratchett

###

As we refrain from lifting the box’s lid, we might spare a thought for Charles Babbage; he died on this date in 1871.  A mathematician, philosopher, inventor, and mechanical engineer, Babbage is best remembered for originating the concept of a programmable computer.  Anxious to eliminate inaccuracies in mathematical tables, by 1822 he had built a small calculating machine able to compute squares.  He then produced prototypes of portions of a larger Difference Engine. (Georg and Edvard Scheutz later constructed the first working devices to the same design, which were successful in limited applications.)  In 1833 he began his programmable Analytical Machine (AKA, the Analytical Engine), the forerunner of modern computers, with coding help from Ada Lovelace, who created an algorithm for it to calculate a sequence of Bernoulli numbers — for which she is remembered as the first computer programmer.

Babbage’s other inventions include the cowcatcher, the dynamometer, the standard railroad gauge, uniform postal rates, occulting lights for lighthouses, Greenwich time signals, the heliograph, and the ophthalmoscope.  He was also passionate about cyphers and lock-picking.

 source

 

Written by (Roughly) Daily

October 18, 2016 at 1:01 am
