“Reality is frequently inaccurate”*…
Machine learning and what it may teach us about reality…
Our latest paradigmatic technology, machine learning, may be revealing the everyday world as more accidental than rule-governed. If so, it will be because machine learning gains its epistemological power from its freedom from the sort of generalisations that we humans can understand or apply.
The opacity of machine learning systems raises serious concerns about their trustworthiness and their tendency towards bias. But the brute fact that they work could be bringing us to a new understanding and experience of what the world is and our role in it…
The world is a black box full of extreme specificity: it might be predictable but that doesn’t mean it is understandable: “Learn from Machine Learning,” by David Weinberger (@dweinberger) in @aeonmag.
* Douglas Adams, The Restaurant at the End of the Universe
###
As we ruminate on the real, we might send carefully-computed birthday greetings to Grace Brewster Murray Hopper. A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.
Hopper also found and documented the first computer “bug” (in 1947).
She has both a ship (the guided-missile destroyer USS Hopper) and a super-computer (the Cray XE6 “Hopper” at NERSC) named in her honor.

“The future is already here – it’s just not evenly distributed”*…

Security, transportation, energy, personal “stuff”– the 2018 staff of Popular Mechanics asked leading engineers and futurists for their visions of future cities, and built a handbook to navigate this new world: “The World of 2045.”
* William Gibson (in The Economist, December 4, 2003)
###
As we take the long view, we might spare a thought for Charles Babbage; he died on this date in 1871. A mathematician, philosopher, inventor, and mechanical engineer, Babbage is best remembered for originating the concept of a programmable computer. Anxious to eliminate inaccuracies in mathematical tables, he first built a small calculating machine able to compute squares. He then produced prototypes of portions of a larger Difference Engine. (Georg and Edvard Scheutz later constructed the first working devices to the same design, and found them successful in limited applications.) In 1833 he began his programmable Analytical Machine (AKA, the Analytical Engine), the forerunner of modern computers, with coding help from Ada Lovelace, who created an algorithm for the Analytical Machine to calculate a sequence of Bernoulli numbers— for which she is remembered as the first computer programmer.
Babbage’s other inventions include the cowcatcher, the dynamometer, the standard railroad gauge, uniform postal rates, occulting lights for lighthouses, Greenwich time signals, the heliograph, and the ophthalmoscope. A true hacker, he was also passionate about ciphers and lock-picking.
Prime News!…
Readers will know that this space follows the hunt for new prime numbers, and more particularly, for new “Mersenne primes.”
Now, as Ars Technica reports:
A new largest prime number has been discovered, mersenne.org reported Tuesday. 2^57,885,161 − 1, which is also the 48th Mersenne prime, was discovered on the computer of Dr. Curtis Cooper, a professor at the University of Central Missouri.
A Mersenne prime is a prime number that can be written in the form M_n = 2^n − 1, and they’re extremely rare finds. Of all the numbers between 0 and 25,964,951 there are 1,622,441 that are prime, but only 42 are Mersenne primes.
The 48th Mersenne prime was discovered as part of the Great Internet Mersenne Prime Search (GIMPS), a project that has used volunteer computers to calculate and search for primes for 17 years. Dr. Cooper’s computer took 39 days of continuous calculation to verify the prime status of the number, which has over 17 million digits…
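The form 2^n − 1 is what makes these numbers practical to hunt: the Lucas–Lehmer test checks the primality of a Mersenne number far faster than general-purpose methods. As a hedged illustration (this is not GIMPS’s actual software, which uses heavily optimized FFT-based arithmetic), here is a minimal Python sketch of the test:

```python
def is_prime(n):
    """Simple trial-division primality check (fine for small exponents)."""
    return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

def lucas_lehmer(p):
    """Return True iff M_p = 2**p - 1 is prime, for a prime exponent p.

    Lucas-Lehmer: with s_0 = 4 and s_k = s_(k-1)**2 - 2,
    M_p is prime iff s_(p-2) == 0 (mod M_p).
    """
    if p == 2:
        return True  # M_2 = 3 is prime; the iteration below assumes odd p
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents p < 20 whose Mersenne number 2**p - 1 is prime
print([p for p in range(2, 20) if is_prime(p) and lucas_lehmer(p)])
# -> [2, 3, 5, 7, 13, 17, 19]  (11 is prime, yet 2**11 - 1 = 2047 = 23 * 89)
```

Note that a prime exponent is necessary but not sufficient, as the case of 2047 shows; that gap between candidates and confirmed primes is why the GIMPS search requires so much volunteer computation.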
Read the full story here; the report from Mersenne.org here; and visit the Mersenne/GIMPS homepage here.
###
As we regret the limited number of fingers and toes with which we were born, we might recall that it was on this date in 1946 that the first ancestor of Dr. Cooper’s computer– the ENIAC (Electronic Numerical Integrator And Computer)– was first demonstrated in operation. (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943 as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania, where it was built. The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world. ENIAC’s basic clock speed was 100,000 cycles per second. Today’s home computers have clock speeds of 1,000,000,000 cycles per second; Dr. Cooper’s, much faster still.