(Roughly) Daily

Posts Tagged ‘John McCarthy’

“The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge”*…

Linnaeus and Buffon

Learning from the past: as John Thornhill explains in his consideration of Jason Roberts’s Every Living Thing, the rivalry between Buffon and Linnaeus has lessons about disrupters and exploitation…

The aristocratic French polymath Georges-Louis Leclerc, Comte de Buffon chose a good year to die: 1788. Reflecting his status as a star of the Enlightenment and author of 35 popular volumes on natural history, Buffon’s funeral carriage drawn by 14 horses was watched by an estimated 20,000 mourners as it processed through Paris. A grateful Louis XVI had earlier erected a statue of a heroic Buffon in the Jardin du Roi, over which the naturalist had masterfully presided. “All nature bows to his genius,” the inscription read.

The next year the French Revolution erupted. As a symbol of the ancien regime, Buffon was denounced as an enemy of progress, his estates in Burgundy seized, and his son, known as the Buffonet, guillotined. In further insult to his memory, zealous revolutionaries marched through the king’s gardens (nowadays known as the Jardin des Plantes) with a bust of Buffon’s great rival, Carl Linnaeus. They hailed the Swedish scientific revolutionary as a true man of the people.

The intense intellectual rivalry between Buffon and Linnaeus, which still resonates today, is fascinatingly told by the author Jason Roberts in his book Every Living Thing, my holiday reading while staying near Buffon’s birthplace in Burgundy. Natural history, like all history, might be written by the victors, as Roberts argues. And for a long time, Linnaeus’s highly influential, but flawed, views held sway. But the book makes a sympathetic case for the further rehabilitation of the much-maligned Buffon.

The two men were, as Roberts writes, exact contemporaries and polar opposites. While Linnaeus obsessed about classifying all biological species into neat categories with fixed attributes and Latin names (Homo sapiens, for example), Buffon emphasised the vast diversity and constantly changing nature of every living thing.

In Roberts’s telling, Linnaeus emerges as a brilliant but ruthless dogmatist, who ignored inconvenient facts that did not fit his theories and gave birth to racial pseudoscience. But it was Buffon’s painstaking investigations and acceptance of complexity that helped inspire the evolutionary theories of Charles Darwin, who later acknowledged that the Frenchman’s ideas were “laughably like mine”.

In two aspects, at least, this 18th-century scientific clash rhymes with our times. The first is to show how intellectual knowledge can often be a source of financial gain. The discovery of crops and commodities in other parts of the world and the development of new methods of cultivation had a huge impact on the economy in that era. “All that is useful to man originates from these natural objects,” Linnaeus wrote. “In one word, it is the foundation of every industry.”

Great wealth was generated from trade in sugar, potatoes, coffee, tea and cochineal while Linnaeus himself explored ways of cultivating pineapples, strawberries and freshwater pearls.

“In many ways, the discipline of natural history in the 18th century was roughly analogous to technology today: a means of disrupting old markets, creating new ones, and generating fortunes in the process,” Roberts writes. As a former software engineer at Apple and a West Coast resident, Roberts knows the tech industry.

Then as now, the addition of fresh inputs into the economy — whether natural commodities back then or digital data today — can lead to astonishing progress, benefiting millions. But it can also lead to exploitation. As Roberts tells me in a telephone interview, it was the scaling up of the sugar industry in the West Indies that led to the slave trade. “Sometimes we think we are inventing the future when we are retrofitting the past,” he says.

The second resonance with today is the danger of believing we know more than we do. Roberts compares Buffon’s state of “curious unknowing” to the concept of “negative capability” described by the English poet John Keats. In a letter written in 1817, Keats argued that we should resist the temptation to explain away things we do not properly understand and accept “uncertainties, mysteries, doubts, without any irritable reaching after fact and reason.”

Armed today with instant access to information and smart machines, the temptation is to ascribe a rational order to everything, as Linnaeus did. But scientific progress depends on a humble acceptance of relative ignorance and a relentless study of the fabric of reality. The spooky nature of quantum mechanics would have blown Linnaeus’s mind. If Buffon still teaches us anything, it is to study the peculiarity of things as they are, not as we might wish them to be…

“What an epic 18th-century scientific row teaches us today,” @johnthornhillft on @itsJason in @FT (gift link)

Pair with “Frameworks” from Céline Henne (@celinehenne): “Knowledge is often a matter of discovery. But when the nature of an enquiry itself is at question, it is an act of creation.”

* Daniel J. Boorstin

###

As we embrace the exceptions, we might send carefully-coded birthday greetings to John McCarthy; he was born on this date in 1927. An eminent computer and cognitive scientist who received both the Turing Award and the National Medal of Science, McCarthy coined the phrase “artificial intelligence” to describe the field of which he was a founder.

It was McCarthy’s 1979 article, “Ascribing Mental Qualities to Machines” (in which he wrote, “Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem solving performance”), that provoked John Searle’s 1980 response in the form of his famous Chinese Room Argument… sparking a broad debate that continues to this day.

 source

“It was orderly, like the universe. It had logic. It was dependable. Using it allowed a kind of moral uplift, as one’s own chaos was also brought under control.”*…

(Roughly) Daily has looked before at the history of the filing cabinet, rooted in the work of Craig Robertson (@craig2robertson). He has deepened his research and published a new book, The Filing Cabinet: A Vertical History of Information. An Xiao Mina offers an appreciation, and a consideration of one of the central questions it raises: can emergent knowledge coexist with an internet that privileges the kind of “certainty” that’s implicit in the filing paradigm that was born with the filing cabinet and that informs our “knowledge systems” today…

… The 20th century saw an emergent information paradigm shaped by corporate capitalism, which emphasized maximizing profit and minimizing the time workers spent on tasks. Offices once kept their information in books—think Ebenezer Scrooge with his quill pen, updating his thick ledger on Christmas. The filing cabinet changed all that, encouraging what Robertson calls “granular certainty,” or “the drive to break more and more of life and its everyday routines into discrete, observable, and manageable parts.” This represented an important conceptualization: Information became a practical unit of knowledge that could be standardized, classified, and effortlessly stored and retrieved.

Take medical records, which require multiple layers of organization to support routine hospital business. “At the Bryn Mawr Hospital,” Robertson writes, “six different card files provided access to patient information: an alphabetical file of admission cards for discharged patients, an alphabetical file for the accident ward, a file to record all operations, a disease file, a diagnostic file, and a doctors’ file that recorded the number of patients each physician referred to the hospital.” The underlying logic of this system was that the storage of medical records didn’t just keep them safe; it made sure that those records could be accessed easily.
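For readers who think in code, the Bryn Mawr card files map onto a familiar pattern: one shared store of records with several independent indexes built over it, each answering a different routine question. Here is a minimal, hypothetical Python sketch of that idea (the records, fields, and names are invented for illustration, not taken from Robertson’s book):

```python
from collections import defaultdict

# One shared store of patient records (invented sample data).
records = [
    {"name": "A. Smith", "ward": "accident", "disease": "fracture",  "doctor": "Dr. Jones"},
    {"name": "B. Lee",   "ward": "general",  "disease": "influenza", "doctor": "Dr. Jones"},
]

# Several "card files" over the same records, each organized along a different
# axis, so a routine question can be answered without rereading every record.
by_name = {r["name"]: r for r in records}   # alphabetical admissions file
by_disease = defaultdict(list)              # disease file
by_doctor = defaultdict(list)               # doctors' (referral) file
for r in records:
    by_disease[r["disease"]].append(r)
    by_doctor[r["doctor"]].append(r)

print(by_name["A. Smith"]["ward"])    # look up one patient directly
print(len(by_doctor["Dr. Jones"]))    # count referrals per physician
```

The sketch is only meant to echo Robertson’s “granular certainty”: once information is broken into uniform, discrete units, adding another index is cheap, and retrieval along any prepared axis becomes effortless.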

Robertson’s deep focus on the filing cabinet grounds the book in history and not historical analogy. He touches very little on Big Data and indexing and instead dives into the materiality of the filing cabinet and the principles of information management that guided its evolution. But students of technology and information studies will immediately see this history shaping our world today…

[And] if the filing cabinet, as a tool of business and capital, guides how we access digital information today, its legacy of certainty overshadows the messiness intrinsic to acquiring knowledge—the sort that requires reflection, contextualization, and good-faith debate. Ask the internet difficult questions with complex answers—questions of philosophy, political science, aesthetics, perception—and you’ll get responses using the same neat little index cards with summaries of findings. What makes for an ethical way of life? What is the best English-language translation of the poetry of Borges? What are the long-term effects of social inequalities, and how do we resolve them? Is it Yanny or Laurel?

Information collection and distribution today tends to follow the rigidity of cabinet logic to its natural extreme, but that bias leaves unattended more complex puzzles. The human condition inherently demands a degree of comfort with uncertainty and ambiguity, as we carefully balance incomplete and conflicting data points, competing value systems, and intricate frameworks to arrive at some form of knowing. In that sense, the filing cabinet, despite its deep roots in our contemporary information architecture, is just one step in our epistemological journey, not its end…

A captivating new history helps us see a humble appliance’s sweeping influence on modern life: “The Logic of the Filing Cabinet Is Everywhere.”

* Jeanette Winterson, Why Be Happy When You Could Be Normal?

###

As we store and retrieve, we might recall that it was on this date in 1955 that the term “artificial intelligence” was coined in a proposal for a “2 month, 10 man study of artificial intelligence” submitted by John McCarthy (Dartmouth College), Marvin Minsky (Harvard University), Nathaniel Rochester (IBM), and Claude Shannon (Bell Telephone Laboratories). The workshop, which took place at Dartmouth a year later, in July and August 1956, is generally recognized as the official birth date of the new field.

Dartmouth Conference attendees: Marvin Minsky, Claude Shannon, Ray Solomonoff and other scientists at the Dartmouth Summer Research Project on Artificial Intelligence (Photo: Margaret Minsky)

source

“God help us — for art is long, and life so short”*…

 

The creation of a homunculus, an artificially made miniature human, from an 1899 edition of Goethe’s Faust

Making life artificially wasn’t as big a deal for the ancients as it is for us. Anyone was supposed to be able to do it with the right recipe, just like baking bread. The Roman poet Virgil described a method for making synthetic bees, a practice known as bougonia, which involved beating a poor calf to death, blocking its nose and mouth, and leaving the carcass on a bed of thyme and cinnamon sticks. “Creatures fashioned wonderfully appear,” he wrote, “first void of limbs, but soon awhir with wings.”

This was, of course, simply an expression of the general belief in spontaneous generation: the idea that living things might arise from nothing within a fertile matrix of decaying matter. Roughly 300 years earlier, Aristotle, in his book On the Generation of Animals, explained how this process yielded vermin, such as insects and mice. No one doubted it was possible, and no one feared it either (apart from the inconvenience); one wasn’t “playing God” by making new life this way.

The furor that has sometimes accompanied the new science of synthetic biology—the attempt to reengineer living organisms as if they were machines for us to tinker with, or even to build them from scratch from the component parts—stems from a decidedly modern construct, a “reverence for life.” In the past, fears about this kind of technological hubris were reserved mostly for proposals to make humans by artificial means—or as the Greeks would have said, by techne, art…

Philip Ball digs into myth, history, and science to untangle the roots of our fears of artificial life: “Man Made: A History of Synthetic Life.”

* Johann Wolfgang von Goethe, Faust: Part One

###

As we marvel that “it’s alive!”, we might send carefully-coded birthday greetings to John McCarthy; he was born on this date in 1927. An eminent computer and cognitive scientist who received both the Turing Award and the National Medal of Science, McCarthy coined the phrase “artificial intelligence” to describe the field of which he was a founder.

It was McCarthy’s 1979 article, “Ascribing Mental Qualities to Machines” (in which he wrote, “Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem solving performance”), that provoked John Searle’s 1980 response in the form of his famous Chinese Room Argument… sparking a broad debate that continues to this day.

 source

 

 

Written by (Roughly) Daily

September 4, 2016 at 1:01 am

Monsters of Grok…

 

From Amorphia Apparel:

Fake band t-shirts for history’s greatest minds: I don’t know about you, but I think science and philosophy are pretty bad ass, so join me in rocking out with some of the most influential thinkers of all time!

More sagacious shirts at Amorphia Apparel

 

As we opt for ring-spun wisdom, we might wish a thoughtful Happy Birthday to cognitive and computer scientist John McCarthy; he was born on this date in 1927. A recipient of the Turing Award, McCarthy coined the phrase “artificial intelligence” (in a 1955 proposal for a 1956 Dartmouth conference), founded the first A.I. Lab (at MIT in 1958), and created LISP (List Processing Language), the computer language most commonly used in AI research.

In 1961, McCarthy was the first to imagine and propose a future in which computers could be shared by multiple users, and computing could be provided as a utility. The idea was popular in the late 60s, then waned in the 70s, as it became clear that hardware and software weren’t (yet) up to the task. But with the new millennium, McCarthy’s concept retook the fore, and in the last few years rose to The Cloud…

source