(Roughly) Daily

Posts Tagged ‘Grace Hopper’

“The brain is a wonderful organ; it starts working the moment you get up in the morning and does not stop until you get into the office”*…

For as long as humans have thought, humans have thought about thinking. George Cave on the power and the limits of the metaphors we’ve used to do that…

For thousands of years, humans have described their understanding of intelligence with engineering metaphors. In the 3rd century BCE, the invention of hydraulics popularized the model of fluid flow (“humours”) in the body. This model lasted until the 1500s, when it was supplanted by the invention of automata and the idea of humans as complex machines. From electrical and chemical metaphors in the 1700s to advances in communications a century later, each metaphor reflected the most advanced thinking of its era. Today is no different: we talk of brains that store, process, and retrieve memories, mirroring the language of computers.

I’ve always believed metaphors to be helpful and productive in communicating unfamiliar concepts. But this fascinating history of cognitive science metaphors shows that flawed metaphors can take hold and limit the scope for alternative ideas. In the worst case, the EU spent 10 years and $1.3 billion building a model of the brain based on the incorrect belief that the brain functions like a computer…

Thinking about thinking, from @George_Cave in @the_prepared.

Apposite: “Finding Language in the Brain.”

* Robert Frost

###

As we cogitate on cognition, we might send carefully-computed birthday greetings to Grace Brewster Murray Hopper.  A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.

Hopper also (inadvertently) contributed one of the most ubiquitous metaphors in computer science: she found and documented the first computer “bug” (in 1947).

She has both a ship (the guided-missile destroyer USS Hopper) and a super-computer (the Cray XE6 “Hopper” at NERSC) named in her honor.


Written by (Roughly) Daily

December 9, 2022 at 1:00 am

“Reality is frequently inaccurate”*…

Machine learning and what it may teach us about reality…

Our latest paradigmatic technology, machine learning, may be revealing the everyday world as more accidental than rule-governed. If so, it will be because machine learning gains its epistemological power from its freedom from the sort of generalisations that we humans can understand or apply.

The opacity of machine learning systems raises serious concerns about their trustworthiness and their tendency towards bias. But the brute fact that they work could be bringing us to a new understanding and experience of what the world is and our role in it…

The world is a black box full of extreme specificity; it might be predictable, but that doesn’t mean it is understandable: “Learn from Machine Learning,” by David Weinberger (@dweinberger) in @aeonmag.


* Douglas Adams, The Restaurant at the End of the Universe

###

As we ruminate on the real, we might send carefully-computed birthday greetings to Grace Brewster Murray Hopper.  A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.

Hopper also found and documented the first computer “bug” (in 1947).

She has both a ship (the guided-missile destroyer USS Hopper) and a super-computer (the Cray XE6 “Hopper” at NERSC) named in her honor.


“We often plough so much energy into the big picture, we forget the pixels”*…

Alvy Ray Smith (see also here) was born before computers, made his first computer graphic in 1964, cofounded Pixar, was the first director of computer graphics at Lucasfilm, and was the first graphics fellow at Microsoft. He is the author of the terrific new book A Biography of the Pixel (2021), from which this excerpt is drawn…

I have billions of pixels in my cellphone, and you probably do too. But what is a pixel? Why do so many people think that pixels are little abutting squares? Now that we’re aswim in an ocean of zettapixels (21 zeros), it’s time to understand what they are. The underlying idea – a repackaging of infinity – is subtle and beautiful. Far from being squares or dots that ‘sort of’ approximate a smooth visual scene, pixels are the profound and exact concept at the heart of all the images that surround us – the elementary particles of modern pictures.

This brief history of the pixel begins with Joseph Fourier in the French Revolution and ends in the year 2000 – the recent millennium. I strip away the usual mathematical baggage that hides the pixel from ordinary view, and then present a way of looking at what it has wrought.

The millennium is a suitable endpoint because it marked what’s called the great digital convergence, an immense but uncelebrated event, when all the old analogue media types coalesced into the one digital medium. The era of digital light – all pictures, for whatever purposes, made of pixels – thus quietly began. It’s a vast field: books, movies, television, electronic games, cellphone displays, app interfaces, virtual reality, weather satellite images, Mars rover pictures – to mention a few categories – even parking meters and dashboards. Nearly all pictures in the world today are digital light, including nearly all the printed words. In fact, because of the digital explosion, this includes nearly all the pictures ever made. Art museums and kindergartens are among the few remaining analogue bastions, where pictures fashioned from old media can reliably be found…

An exact mathematical concept, pixels are the elementary particles of pictures, based on a subtle unpacking of infinity: “Pixel: a biography,” from @alvyray.
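The “repackaging of infinity” Smith describes is the sampling theorem at work: a smooth, band-limited signal is captured exactly by discrete point samples and can be rebuilt from them. A minimal one-dimensional sketch in Python (our toy illustration, not an excerpt from the book):

import numpy as np

def reconstruct(samples, rate, t):
    # Whittaker-Shannon interpolation: rebuild the continuous signal from its point samples
    n = np.arange(len(samples))
    return np.sum(samples * np.sinc(rate * t[:, None] - n), axis=1)

rate = 8                                  # samples per unit distance
x_samples = np.arange(0, 2, 1 / rate)     # sample positions (the "pixel" centres)
samples = np.sin(2 * np.pi * x_samples)   # a smooth signal, well below the Nyquist limit

x_fine = np.linspace(0.5, 1.5, 200)       # evaluate away from the ends, where truncation error lives
error = np.max(np.abs(reconstruct(samples, rate, x_fine) - np.sin(2 * np.pi * x_fine)))
print(f"max reconstruction error: {error:.3f}")  # small relative to the unit amplitude; shrinks with more samples

On this view, the little square you see when you zoom in is just one crude way of displaying a sample; the pixel itself is the point value.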

* Dame Silvia Cartwright

###

As we ruminate on resolution, we might recall that it was on this date in 1947 that fabled computer scientist Grace Hopper (see here and here), then a programmer on Harvard’s Mark II Aiken Relay Calculator, found and documented the first computer “bug”– an insect that had lodged in the works.  The incident is recorded in Hopper’s logbook alongside the offending moth, taped to the logbook page: “15:45 Relay #70 Panel F (moth) in relay. First actual case of bug being found.”

This anecdote has led to Hopper being pretty widely credited with coining the term “bug” (and ultimately “de-bug”) in its technological usage… but the term actually dates back at least to Thomas Edison…

Grace Hopper’s log entry (source)

Written by (Roughly) Daily

September 9, 2021 at 1:00 am

“Typography is the craft of endowing human language with a durable visual form”*…

Although the modern design world continues its well-documented love affair with the look and feel of letterpress, the once highly regarded trade of printing press operation has largely faded out as a career path, giving way to the relentless growth of digital printing methods.

Ireland’s National Print Museum in Dublin was founded in 1996 by retired printers who couldn’t bear to watch their trades disappear without trace or fanfare. “The Chapel”, a core group of volunteers (mostly retirees), is dedicated to keeping the museum’s collection of historical printing machines — and the skills required to operate them — from fading away as well…

In Great Britain, a collective of union printers is known as a “chapel.” While the exact origins are unknown, the term can be traced back to William Caxton, credited with bringing the first printing press to England in 1476…

A glorious photographic tour of “The Chapel: Inside Ireland’s National Print Museum.”

* Robert Bringhurst, The Elements of Typographic Style

###

As we love the lead, we might recall that it was on this date in 1947 that fabled computer scientist Grace Hopper (see here and here), then a programmer on Harvard’s Mark II Aiken Relay Calculator, found and documented the first computer “bug”– an insect that had lodged in the works.  The incident is recorded in Hopper’s logbook alongside the offending moth, taped to the logbook page: “15:45 Relay #70 Panel F (moth) in relay. First actual case of bug being found.”

This anecdote has led to Hopper being pretty widely credited with coining the term “bug” (and ultimately “de-bug”) in its technological usage… but the term actually dates back at least to Thomas Edison…

Grace Hopper’s log entry (source)

Written by (Roughly) Daily

September 9, 2020 at 1:01 am

“Please cut off a nanosecond and send it over to me”*…

Commodore Grace M. Hopper, USN

“Amazing Grace” Hopper, a seminal computer scientist and Rear Admiral in the U.S. Navy, explains a “nanosecond”…
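Her famous prop was a length of wire roughly 11.8 inches long, the farthest a signal can travel in one nanosecond. The arithmetic is simple; a minimal Python sketch (our framing, not Hopper’s):

# How far light (the upper bound on any signal) travels in one nanosecond
SPEED_OF_LIGHT_M_PER_S = 299_792_458     # metres per second, exact by definition
NANOSECOND_S = 1e-9                      # one nanosecond, in seconds

distance_m = SPEED_OF_LIGHT_M_PER_S * NANOSECOND_S   # ~0.2998 metres
distance_in = distance_m / 0.0254                    # metres to inches

print(f"one nanosecond of travel: {distance_m * 100:.1f} cm (~{distance_in:.1f} in)")
# -> one nanosecond of travel: 30.0 cm (~11.8 in), the length of the wires Hopper handed out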

* Grace Hopper

###

As we celebrate clarity, we might recall that it was on this date in 1961 that Robert Noyce was issued patent number 2981877 for his “semiconductor device-and-lead structure,” the first patent for what would come to be known as the integrated circuit.  In fact, another engineer, Jack Kilby, had separately and essentially simultaneously developed the same technology (Kilby’s design was rooted in germanium; Noyce’s in silicon) and had filed a few months earlier than Noyce… a fact that was recognized in 2000 when Kilby was awarded the Nobel Prize, an honor in which Noyce, who had died in 1990, did not share.

Noyce (left) and Kilby (right) (source)

Written by (Roughly) Daily

April 25, 2019 at 1:01 am
