(Roughly) Daily

Posts Tagged ‘computing’

“A flash of revelation and a flash of response”*…

 

“A Cellar Dive in the Bend,” c.1895, by Richard Hoe Lawrence and Henry G. Piffard

All photography requires light, but the light used in flash photography is unique — shocking, intrusive and abrupt. It’s quite unlike the light that comes from the sun, or even from ambient illumination. It explodes, suddenly, into darkness.

The history of flash goes right back to the challenges faced by early photographers who wanted to use their cameras in places where there was insufficient light — indoors, at night, in caves. The first flash photograph was probably a daguerreotype of a fossil, taken in 1839 by burning limelight…

In its early days, a sense of quasi-divine revelation was invoked by some flash photographers, especially when documenting deplorable social conditions. Jacob Riis, for example, working in New York in the late 1880s, used transcendental language to help underscore flash’s significance as an instrument of intervention and purgation. But it’s in relation to documentary photography that we encounter most starkly flash’s singular, and contradictory, aspects. It makes visible that which would otherwise remain in darkness; but it is often associated with unwelcome intrusion, a rupturing of private lives and interiors.

Yet flash brings a form of democracy to the material world. Many details take on unplanned prominence, as we see in the work of those Farm Security Administration photographers who used flash in the 1930s and laid bare the reality of poverty during the Depression. A sudden flare of light reveals each dent on a kitchen utensil and the label on each carefully stored can; each photograph on the mantel; each cherished ornament; each little heap of waste paper or discarded rag; each piece of polished furniture or stained floor or accumulation of dust; each wrinkle. Flash can make plain, bring out of obscurity, the appearance of things that may never before have been seen with such clarity…

Find illumination at “A short history of flash photography.”

* J.M. Coetzee, Disgrace

###

As we glory in the glare, we might send elegantly calculated birthday greetings to Augusta Ada King-Noel, Countess of Lovelace (née Byron); she was born on this date in 1815.  The daughter of the poet Lord Byron, she was the author of what can reasonably be considered the first “computer program”– which makes her one of the “parents” of the modern computer.  She worked in collaboration with her long-time friend and thought partner Charles Babbage (known as “the father of computers”), in particular in conjunction with his work on the Analytical Engine.
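Her famous program, set out in Note G of her 1843 commentary on the Engine, described how the Analytical Engine could compute the Bernoulli numbers. A minimal modern sketch of that same calculation, in Python and purely for illustration (her original was a table of Engine operations, not code in any modern sense):

```python
# Illustrative re-creation of the computation described in Lovelace's Note G:
# generating the Bernoulli numbers.
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return [B_0, ..., B_n] (B_1 = -1/2 convention), using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for every m >= 1."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))             # solve the recurrence for B_m
    return B

for i, b in enumerate(bernoulli_numbers(8)):
    print(f"B_{i} = {b}")                  # e.g. B_6 = 1/42, B_8 = -1/30
```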

Ada, Countess of Lovelace, 1840

source

 

 

Written by LW

December 10, 2017 at 1:01 am

“It is not enough for code to work”*…

 

It’s been said that software is “eating the world.” More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. This was perhaps never clearer than in the summer of 2015, when on a single day, United Airlines grounded its fleet because of a problem with its departure-management system; trading was suspended on the New York Stock Exchange after an upgrade; the front page of The Wall Street Journal’s website crashed; and Seattle’s 911 system went down again, this time because a different router failed. The simultaneous failure of so many software systems smelled at first of a coordinated cyberattack. Almost more frightening was the realization, late in the day, that it was just a coincidence…

Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break… Software failures are failures of understanding, and of imagination…
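The redundancy arithmetic behind that older framework is easy to make concrete. A minimal sketch, with invented failure probabilities and an assumption of independent failures (neither comes from the article):

```python
# Pre-software reliability arithmetic: make each part reliable, then add
# redundant copies so the whole system fails only if every copy fails at once.
# The probabilities below are invented for illustration; failures are assumed
# to be independent.

def system_failure_probability(p_part: float, copies: int) -> float:
    """Probability that all `copies` redundant parts fail on the same flight."""
    return p_part ** copies

p_engine = 1e-5  # assumed per-flight failure probability of a single engine
print(system_failure_probability(p_engine, copies=1))   # 1e-05
print(system_failure_probability(p_engine, copies=2))   # 1e-10
```

The article's point is that software sidesteps this arithmetic entirely: a bug is present, identically, in every redundant copy, so duplication offers no protection against a failure of understanding.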

Invisible– but all too real and painful– problems, and the attempts to make them visible: “The Coming Software Apocalypse.”

* Robert C. Martin, Clean Code: A Handbook of Agile Software Craftsmanship

###

As we Code for America, we might recall that it was on this date in 1983 that Microsoft released its first word processor, Microsoft Word 1.0.  Written for MS-DOS-compatible systems, Word was the first word processing software to make extensive use of a computer mouse. (Not coincidentally, Microsoft had released a computer mouse for IBM-compatible PCs earlier that year.)  A free demo version of Word was included with that month’s issue of PC World– the first time a floppy disk had been included with a magazine.

 source

 

Written by LW

September 29, 2017 at 1:01 am

“Artificial intelligence is growing up fast”*…

 

Every moment of your waking life and whenever you dream, you have the distinct inner feeling of being “you.” When you see the warm hues of a sunrise, smell the aroma of morning coffee or mull over a new idea, you are having conscious experience. But could an artificial intelligence (AI) ever have experience, like some of the androids depicted in Westworld or the synthetic beings in Blade Runner?

The question is not so far-fetched. Robots are currently being developed to work inside nuclear reactors, fight wars and care for the elderly. As AIs grow more sophisticated, they are projected to take over many human jobs within the next few decades. So we must ponder the question: Could AIs develop conscious experience?…

It’s not easy, but a newly proposed test might be able to detect consciousness in a machine: “Is anyone home? A way to find out if AI has become self-aware.”

* Diane Ackerman

###

As we ponder personhood, we might recall that it was on this date in 1967 that the US Navy recalled Grace Murray Hopper to active duty to help standardize its computer languages– above all COBOL.  With a team drawn from several computer manufacturers and the Pentagon, Hopper – who had worked on the Mark I and II computers at Harvard in the 1940s – had earlier helped create the specifications for COBOL (COmmon Business Oriented Language) with business uses in mind.  Those early COBOL efforts aimed at creating easily readable computer programs with as much machine independence as possible.

A seminal computer scientist and ultimately a Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) had invented the first compiler for a computer programming language, and appears also to have been the first to use the word “bug” in the context of computer science, taping into her logbook a moth which had fallen into a relay of the Harvard Mark II computer.

She has both a ship (the guided-missile destroyer USS Hopper) and a supercomputer (the Cray XE6 “Hopper” at NERSC) named in her honor.

 source [and here]

 

Written by LW

August 1, 2017 at 1:01 am

“I wonder if computers ever dream of humans”*…

 

How old are the fields of robotics and artificial intelligence? Many might trace their origins to the mid-twentieth century, and the work of people such as Alan Turing, who wrote about the possibility of machine intelligence in the ’40s and ’50s, or the MIT engineer Norbert Wiener, a founder of cybernetics. But these fields have prehistories — traditions of machines that imitate living and intelligent processes — stretching back centuries and, depending how you count, even millennia…

Defecating ducks [see here], talking busts, and mechanized Christs — Jessica Riskin on the wonderful history of automata, machines built to mimic the processes of intelligent life: “Frolicsome Engines: The Long Prehistory of Artificial Intelligence.”

* David Mitchell, Ghostwritten

###

As we take the Turing Test, we might spare a thought for Eadweard Muybridge; he died on this date in 1904. Best remembered now for his pioneering photographic studies of motion (created for former California Governor Leland Stanford to help settle a bet) and his early work in motion-picture projection, he was famous in his own day for his large photographs of Yosemite Valley.  The approaches he developed for the study of motion are at the heart of both animation and computer-based motion analysis today.

 source

 source

 

Written by LW

May 8, 2016 at 1:01 am

“A computer once beat me at chess, but it was no match for me at kick boxing”*…

 

J. Presper Eckert, foreground left, and John W. Mauchly, leaning against a pole, are pictured with the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania in 1946. Mauchly and Eckert were the masterminds behind ENIAC, arguably the first modern computer. When it was fully operational, ENIAC filled a room roughly 30 by 50 feet and weighed nearly 30 tons. It drew roughly 150 kilowatts; every hour it ran, it consumed enough electricity to power a typical Philadelphia home for more than a week.

 

The ENIAC– or at least a good bit of it– has been saved…

Eccentric billionaires are tough to impress, so their minions must always think big when handed vague assignments. Ross Perot’s staffers did just that in 2006, when their boss declared that he wanted to decorate his Plano, Texas, headquarters with relics from computing history. Aware that a few measly Apple I’s and Altair 8800’s wouldn’t be enough to satisfy a former presidential candidate, Perot’s people decided to acquire a more singular prize: a big chunk of ENIAC, the “Electronic Numerical Integrator And Computer.” The ENIAC was a 27-ton, 1,800-square-foot bundle of vacuum tubes and diodes that was arguably the world’s first true computer. The hardware that Perot’s team diligently unearthed and lovingly refurbished is now accessible to the general public for the first time, back at the same Army base where it almost rotted into oblivion…

Read the whole story– and see more photos of computing, v1.0– at “How the World’s First Computer Was Rescued From the Scrap Heap.”

* Emo Philips

###

As we praise the preservationists, we might recall that it was on this date in 1967 that Jocelyn Bell Burnell and Antony Hewish observed the first pulsar– a “pulsating radio star.”  A highly magnetized, rotating neutron star, a pulsar emits a beam of electromagnetic radiation that can be detected on Earth only when it is beamed in our direction (and so seems, from Earth’s vantage, to be pulsing).  Pulsars have short, regular rotational periods, and so produce the pulses that we detect at very precise intervals.

Schematic view of a pulsar. The sphere in the middle represents the neutron star, the curves indicate the magnetic field lines, the protruding cones represent the emission beams and the green line represents the axis on which the star rotates.

source

 

Written by LW

November 28, 2014 at 1:01 am

Hello?… Hello?…

Quoth the always-amusing Tyler Hellard: “ConferenceCall.biz is a spectacular display of existential despair and the modern condition.”

And indeed it is.

###

As we remember that we can press “8” to mute at any time, we might email elegantly and creatively designed birthday greetings to Douglas Carl Engelbart; he was born on this date in 1925.  An engineer and inventor who was a computing and internet pioneer, Doug (who passed away last year) is best remembered for his seminal work on human-computer interface issues, and for “the Mother of All Demos” in 1968, at which he demonstrated for the first time the computer mouse, hypertext, networked computers, and the earliest versions of graphical user interfaces… that’s to say, computing as we know it.

 source

Written by LW

January 30, 2014 at 1:01 am
