(Roughly) Daily

Posts Tagged ‘John McCarthy’

“It was orderly, like the universe. It had logic. It was dependable. Using it allowed a kind of moral uplift, as one’s own chaos was also brought under control.”*…

(Roughly) Daily has looked before at the history of the filing cabinet, rooted in the work of Craig Robertson (@craig2robertson). He has deepened his research and published a new book, The Filing Cabinet: A Vertical History of Information. An Xiao Mina offers an appreciation– and a consideration of one of the central questions it raises: can emergent knowledge coexist with an internet that privileges the kind of “certainty” implicit in the filing paradigm that was born with the filing cabinet and that informs our “knowledge systems” today…

… The 20th century saw an emergent information paradigm shaped by corporate capitalism, which emphasized maximizing profit and minimizing the time workers spent on tasks. Offices once kept their information in books—think Ebenezer Scrooge with his quill pen, updating his thick ledger on Christmas. The filing cabinet changed all that, encouraging what Robertson calls “granular certainty,” or “the drive to break more and more of life and its everyday routines into discrete, observable, and manageable parts.” This represented an important conceptualization: Information became a practical unit of knowledge that could be standardized, classified, and effortlessly stored and retrieved.

Take medical records, which require multiple layers of organization to support routine hospital business. “At the Bryn Mawr Hospital,” Robertson writes, “six different card files provided access to patient information: an alphabetical file of admission cards for discharged patients, an alphabetical file for the accident ward, a file to record all operations, a disease file, a diagnostic file, and a doctors’ file that recorded the number of patients each physician referred to the hospital.” The underlying logic of this system was that the storage of medical records didn’t just keep them safe; it made sure that those records could be accessed easily.
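In modern software terms, each of those six card files is a secondary index: the same records, reachable by a different access path, with storage organized so that retrieval is a single lookup rather than a scan. A rough, hypothetical sketch in Python (the record fields and sample data here are invented for illustration, not drawn from Robertson’s book):

```python
from collections import defaultdict

# Hypothetical patient records; the fields and data are invented.
records = [
    {"name": "A. Adams", "ward": "accident", "disease": "influenza", "doctor": "Dr. Jones"},
    {"name": "B. Brown", "ward": "general", "disease": "fracture", "doctor": "Dr. Jones"},
    {"name": "C. Clark", "ward": "accident", "disease": "influenza", "doctor": "Dr. Smith"},
]

# Each "card file" is a secondary index over the same records.
by_name = defaultdict(list)     # the alphabetical admission file
by_disease = defaultdict(list)  # the disease file
by_doctor = defaultdict(list)   # the doctors' file

for r in records:
    by_name[r["name"]].append(r)
    by_disease[r["disease"]].append(r)
    by_doctor[r["doctor"]].append(r)

# Retrieval is one lookup, not a page-by-page scan of a ledger.
print([r["name"] for r in by_disease["influenza"]])  # ['A. Adams', 'C. Clark']
print(len(by_doctor["Dr. Jones"]))                   # 2 patients referred
```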

Robertson’s deep focus on the filing cabinet grounds the book in history and not historical analogy. He touches very little on Big Data and indexing and instead dives into the materiality of the filing cabinet and the principles of information management that guided its evolution. But students of technology and information studies will immediately see this history shaping our world today…

[And] if the filing cabinet, as a tool of business and capital, guides how we access digital information today, its legacy of certainty overshadows the messiness intrinsic to acquiring knowledge—the sort that requires reflection, contextualization, and good-faith debate. Ask the internet difficult questions with complex answers—questions of philosophy, political science, aesthetics, perception—and you’ll get responses using the same neat little index cards with summaries of findings. What makes for an ethical way of life? What is the best English-language translation of the poetry of Borges? What are the long-term effects of social inequalities, and how do we resolve them? Is it Yanny or Laurel?

Information collection and distribution today tends to follow the rigidity of cabinet logic to its natural extreme, but that bias leaves unattended more complex puzzles. The human condition inherently demands a degree of comfort with uncertainty and ambiguity, as we carefully balance incomplete and conflicting data points, competing value systems, and intricate frameworks to arrive at some form of knowing. In that sense, the filing cabinet, despite its deep roots in our contemporary information architecture, is just one step in our epistemological journey, not its end…

A captivating new history helps us see a humble appliance’s sweeping influence on modern life: “The Logic of the Filing Cabinet Is Everywhere.”

* Jeanette Winterson, Why Be Happy When You Could Be Normal?

###

As we store and retrieve, we might recall that it was on this date in 1955 that the term “artificial intelligence” was coined in a proposal for a “2 month, 10 man study of artificial intelligence” submitted by John McCarthy (Dartmouth College), Marvin Minsky (Harvard University), Nathaniel Rochester (IBM), and Claude Shannon (Bell Telephone Laboratories). The workshop, which took place at Dartmouth a year later, in July and August 1956, is generally recognized as the official birth of the new field.

Dartmouth Conference attendees: Marvin Minsky, Claude Shannon, Ray Solomonoff and other scientists at the Dartmouth Summer Research Project on Artificial Intelligence (Photo: Margaret Minsky)

source

“God help us — for art is long, and life so short”*…

 

The creation of a homunculus, an artificially made miniature human, from an 1899 edition of Goethe’s Faust

Making life artificially wasn’t as big a deal for the ancients as it is for us. Anyone was supposed to be able to do it with the right recipe, just like baking bread. The Roman poet Virgil described a method for making synthetic bees, a practice known as bougonia, which involved beating a poor calf to death, blocking its nose and mouth, and leaving the carcass on a bed of thyme and cinnamon sticks. “Creatures fashioned wonderfully appear,” he wrote, “first void of limbs, but soon awhir with wings.”

This was, of course, simply an expression of the general belief in spontaneous generation: the idea that living things might arise from nothing within a fertile matrix of decaying matter. Roughly 300 years earlier, Aristotle, in his book On the Generation of Animals, explained how this process yielded vermin, such as insects and mice. No one doubted it was possible, and no one feared it either (apart from the inconvenience); one wasn’t “playing God” by making new life this way.

The furor that has sometimes accompanied the new science of synthetic biology—the attempt to reengineer living organisms as if they were machines for us to tinker with, or even to build them from scratch from the component parts—stems from a decidedly modern construct, a “reverence for life.” In the past, fears about this kind of technological hubris were reserved mostly for proposals to make humans by artificial means—or as the Greeks would have said, by techne, art…

Philip Ball digs into myth, history, and science to untangle the roots of our fears of artificial life: “Man Made: A History of Synthetic Life.”

* Johann Wolfgang von Goethe, Faust: Part One

###

As we marvel that “it’s alive!,” we might send carefully coded birthday greetings to John McCarthy; he was born on this date in 1927.  An eminent computer and cognitive scientist– he was awarded both the Turing Award and the National Medal of Science– McCarthy coined the phrase “artificial intelligence” to describe the field of which he was a founder.

It was McCarthy’s 1979 article, “Ascribing Mental Qualities to Machines” (in which he wrote, “Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem solving performance”), that provoked John Searle’s 1980 rejoinder in the form of his famous Chinese Room Argument… sparking a broad debate that continues to this day.
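McCarthy’s thermostat is easy to make concrete. A toy sketch– invented here for illustration, not drawn from McCarthy’s paper– of a machine “as simple as a thermostat” whose entire belief state is one of three propositions about the room:

```python
# A toy "thermostat with beliefs," in the spirit of McCarthy's example.
# Its whole belief state is a single proposition about the room.
def thermostat_belief(room_temp, setpoint, tolerance=1.0):
    if room_temp < setpoint - tolerance:
        return "the room is too cold"  # the belief that switches the heat on
    if room_temp > setpoint + tolerance:
        return "the room is too hot"   # the belief that switches the heat off
    return "the room is okay"          # the belief that leaves things alone

print(thermostat_belief(17.0, 20.0))  # the room is too cold
print(thermostat_belief(20.5, 20.0))  # the room is okay
```

Whether such a state deserves the word “belief” is, of course, exactly what Searle disputed.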

 source

 

 

Written by (Roughly) Daily

September 4, 2016 at 1:01 am

Monsters of Grok…

 

From Amorphia Apparel:

Fake band t-shirts for history’s greatest minds: I don’t know about you, but I think science and philosophy are pretty bad ass, so join me in rocking out with some of the most influential thinkers of all time!

More sagacious shirts at Amorphia Apparel

 

As we opt for ring-spun wisdom, we might wish a thoughtful Happy Birthday to cognitive and computer scientist John McCarthy; he was born on this date in 1927.  A recipient of the Turing Award, McCarthy coined the phrase “Artificial Intelligence” (in a 1955 proposal for a 1956 Dartmouth conference), founded the first A.I. Lab (at MIT in 1958), and created LISP (the List Processing language), for decades the computer language most commonly used in AI research.
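For a taste of what “list processing” means, here is a minimal, illustrative sketch– rendered in Python rather than Lisp, and ours rather than McCarthy’s– of the cons/car/cdr primitives on which LISP is built:

```python
# LISP's core idea, sketched in Python: a list is a chain of cons
# cells, and recursion over cdr is the basic processing idiom.
def cons(head, tail):
    return (head, tail)

def car(pair):
    return pair[0]

def cdr(pair):
    return pair[1]

NIL = None  # LISP's empty list

lst = cons(1, cons(2, cons(3, NIL)))  # the list (1 2 3)

def length(pair):
    return 0 if pair is NIL else 1 + length(cdr(pair))

print(car(lst))       # 1
print(car(cdr(lst)))  # 2
print(length(lst))    # 3
```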

In 1961, McCarthy was the first to imagine and propose a future in which computers could be shared by multiple users, and computing could be provided as a utility.  The idea was popular in the late 60s, then waned in the 70s, as it became clear that hardware and software weren’t (yet) up to the task.  But with the new millennium, McCarthy’s concept returned to the fore– and in the last few years, rose to The Cloud…

source

 
