(Roughly) Daily


“I’m a little tea pot / Short and stout”*…

The original Utah teapot, currently on display at the Computer History Museum in Mountain View, California.

The fascinating story of the “Utah teapot,” the ur-object in the development of computer graphics…

This unassuming object—the “Utah teapot,” as it’s affectionately known—has had an enormous influence on the history of computing, dating back to 1974, when computer scientist Martin Newell was a Ph.D. student at the University of Utah.

The U of U was a powerhouse of computer graphics research then, and Newell had some novel ideas for algorithms that could realistically display 3D shapes—rendering complex effects like shadows, reflective textures, or rotations that reveal obscured surfaces. But, to his chagrin, he struggled to find a digitized object worthy of his methods. Objects that were typically used for simulating reflections, like a chess pawn, a donut, and an urn, were too simple.

One day over tea, Newell told his wife Sandra that he needed more interesting models. Sandra suggested that he digitize the shapes of the tea service they were using, a simple Melitta set from a local department store. It was an auspicious choice: The curves, handle, lid, and spout of the teapot all conspired to make it an ideal object for graphical experiment. Unlike other objects, the teapot could, for instance, cast a shadow on itself in several places. Newell grabbed some graph paper and a pencil, and sketched it.

Back in his lab, he entered the sketched coordinates—called Bézier control points, first used in the design of automobile bodies—on a Tektronix storage tube, an early text and graphics computer terminal. The result was a lovely virtual teapot, more versatile (and probably cuter) than any 3D model to date.
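(For the curious: the surface primitive behind the teapot model is the bicubic Bézier patch, which blends a 4×4 grid of control points with cubic Bernstein polynomials; the full teapot is assembled from a few dozen such patches. Here is a minimal sketch of the evaluation, using an illustrative flat patch rather than Newell’s actual control-point data:)

```python
from math import comb

# Evaluate a point on a bicubic Bezier patch -- the surface primitive
# from which the Utah teapot model is assembled. The control-point
# values below are illustrative, not Newell's actual teapot data.

def bernstein3(i, t):
    """Cubic Bernstein basis polynomial B_{i,3}(t)."""
    return comb(3, i) * (t ** i) * ((1 - t) ** (3 - i))

def bezier_patch_point(ctrl, u, v):
    """Blend a 4x4 grid of (x, y, z) control points at parameters (u, v)."""
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein3(i, u) * bernstein3(j, v)
            px, py, pz = ctrl[i][j]
            x += w * px
            y += w * py
            z += w * pz
    return (x, y, z)

# A flat illustrative patch: z = 0, control points spread over a unit square.
ctrl = [[(i / 3.0, j / 3.0, 0.0) for j in range(4)] for i in range(4)]
print(bezier_patch_point(ctrl, 0.5, 0.5))  # centre of the patch
```

Because the control points of this toy patch are evenly spaced, the patch reproduces the flat unit square exactly; the curved teapot patches arise the same way, just with control points pulled off the plane.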

The new model was particularly appealing to Newell’s colleague, Jim Blinn [of whom Ivan Sutherland, the head of the program at Utah and a computer graphics pioneer said, “There are about a dozen great computer graphics people and Jim Blinn is six of them”]. One day, demonstrating how his software could adjust an object’s height, Blinn flattened the teapot a bit, and decided he liked the look of that version better. The distinctive Utah teapot was born.

The computer model proved useful for Newell’s own research, featuring prominently in his next few publications. But he and Blinn also took the important step of sharing their model publicly. As it turned out, other researchers were also starved for interesting 3D models, and the digital teapot was exactly the experimental test bed they needed. At the same time, the shape was simple enough for Newell to input and for computers to process. (Rumor has it some researchers even had the data points memorized!) And unlike many household items, like furniture or fruit-in-a-bowl, the teapot’s simulated surface looked realistic without superimposing an artificial, textured pattern.

The teapot quickly became a beloved staple of the graphics community. Teapot after teapot graced the pages and covers of computer graphics journals. “Anyone with a new idea about rendering and lighting would announce it by first trying it out on a teapot,” writes animator Tom Sito in Moving Innovation...

These days, the Utah teapot has achieved legendary status. It’s a built-in shape in many 3D graphics software packages, used for testing, benchmarking, and demonstration. Graphics geeks like to sneak it into scenes and games as an in-joke, an homage to their countless hours of rendering teapots; hence its appearances in Windows, Toy Story, and The Simpsons.

Over the past few years, the teapot has been 3D printed back into the physical world, both as a trinket and as actual china. Pixar even made its own music video in honor of the teapot, titled “This Teapot’s Made for Walking,” and a teapot wind-up toy as a promotion for its Renderman software.

Newell has jokingly lamented that, despite all his algorithmic innovations, he’ll be remembered primarily for “that damned teapot.” But as much as computer scientists try to prove their chops by inventing clever algorithms, test beds for experimentation often leave a bigger mark. Newell essentially designed the model organism of computer graphics: the teapot is to graphics researchers what lab mice are to biologists.

For the rest of us the humble teapot serves as a reminder that, in the right hands, something simple can become an icon of creativity and hidden potential…

How a humble serving piece shaped a technological domain: “The Most Important Object In Computer Graphics History Is This Teapot,” from Jesse Dunietz (@jdunietz)

* from “I’m a Little Tea Pot,” a 1939 novelty song by George Harold Sanders and Clarence Z. Kelley


As we muse on models, we might send foundational birthday greetings to Michael Faraday; he was born on this date in 1791. One of the great experimental scientists of all time, Faraday made huge contributions to the study of electromagnetism and electrochemistry.

Although Faraday received little formal education, he was one of the most influential scientists in history. It was by his research on the magnetic field around a conductor carrying a direct current that Faraday established the basis for the concept of the electromagnetic field in physics. Faraday also established that magnetism could affect rays of light and that there was an underlying relationship between the two phenomena. He similarly discovered the principles of electromagnetic induction and diamagnetism, and the laws of electrolysis. His inventions of electromagnetic rotary devices formed the foundation of electric motor technology, and it was largely due to his efforts that electricity became practical for use in technology [including, of course, computing and computer graphics].

As a chemist, Faraday discovered benzene, investigated the clathrate hydrate of chlorine, invented an early form of the Bunsen burner and the system of oxidation numbers, and popularised terminology such as “anode”, “cathode”, “electrode” and “ion”. Faraday ultimately became the first and foremost Fullerian Professor of Chemistry at the Royal Institution, a lifetime position.

Faraday was an excellent experimentalist who conveyed his ideas in clear and simple language; his mathematical abilities, however, did not extend as far as trigonometry and were limited to the simplest algebra. James Clerk Maxwell took the work of Faraday and others and summarized it in a set of equations which is accepted as the basis of all modern theories of electromagnetic phenomena. On Faraday’s uses of lines of force, Maxwell wrote that they show Faraday “to have been in reality a mathematician of a very high order – one from whom the mathematicians of the future may derive valuable and fertile methods.”…

Albert Einstein kept a picture of Faraday on his study wall, alongside pictures of Arthur Schopenhauer and James Clerk Maxwell. Physicist Ernest Rutherford stated, “When we consider the magnitude and extent of his discoveries and their influence on the progress of science and of industry, there is no honour too great to pay to the memory of Faraday, one of the greatest scientific discoverers of all time.”



“Supersymmetry was (and is) a beautiful mathematical idea. The problem with applying supersymmetry is that it is too good for this world.”*…

Physicists reconsider their options…

A wise proverb suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century—and, indeed, the 19th before it—were periods of triumph for them. They transformed understanding of the material universe and thus people’s ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.

In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.

This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics’s eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein’s general theory of relativity. Einstein’s theory explains gravity. The Standard Model explains the other three fundamental forces—electromagnetism and the weak and strong nuclear forces—and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called “theory of everything”.

String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model, the further particles predicted by Susy, which posits that the Standard Model’s mathematical fragility will go away if each of that model’s particles has a heavier “supersymmetric” partner particle, or “sparticle”, and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.

But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase…

Bye, bye little Susy? Supersymmetry isn’t (so far, anyway) proving out; and prospects look dim. But a similar fallow period in physics led to quantum theory and relativity: “Physics seeks the future.”

* Frank Wilczek


As we ponder paradigms, we might send insightful birthday greetings to Friedrich Wilhelm Ostwald; he was born on this date in 1853. A chemist and philosopher, he made many specific contributions to his field (including advances on atomic theory), and was one of the founders of the field of physical chemistry. He won the Nobel Prize in Chemistry in 1909.

Following his retirement in 1906 from academic life, Ostwald became involved in philosophy, art, and politics– to each of which he made significant contributions.


“One cannot walk down an avenue, converse with a friend, enter a building, browse beneath the sandstone arches of an old arcade without meeting an instrument of time.”*…

Frieze on the Tower of the Winds in Athens, an early public clock

Time has ordered human life for millennia….

The Tower of the Winds, in the Greek city of Athens… is one of the best-preserved buildings from the ancient world. This octagonal marble tower, sited close to a busy marketplace at the foot of the hill of the famous Acropolis, rises forty-two feet into the air and measures twenty-six feet across, and it was an astonishing sight for the people of this crowded and vibrant city. The external walls were covered in brightly colored reliefs and moldings representing the eight winds, with each of the eight walls, and a semi-circular annex, carrying a sundial. Inside, the ceiling was painted a stunning blue, covered with golden stars. At the center of the imposing interior was a water clock, which was fed from a sacred source high up on the hill of the Acropolis called the Clepsydra, a name which became synonymous with all water clocks. The clock is believed once to have driven a complex mechanical model of the heavens themselves, like a planetarium, orrery, or armillary sphere.

Nobody is quite sure when the Tower of the Winds was built, but it was probably about 140 BC. As with the sundial at the Roman Forum, we can think of it as an early public clock tower, giving Athenians the time of day as they went about their daily business at the market and elsewhere, and giving order to their lives. It was also symbolic of a wider order. The gods of the winds, depicted on its decorative panels, were allegories of world order; the stars inside, together with the water clock and its mechanical replica of the heavens, were symbolic of a cosmic order. Certainly, it was an astonishing spectacle.

But, also like the sundial proudly installed by Valerius in Rome, the Tower of the Winds may have carried a further message. If, as some historians believe, the structure was built by Attalos II, king of the Greek city of Pergamon, to commemorate the Athenian defeat of the Persian Navy in 480 BC, then it could serve as a vivid peacetime reminder of the military strength of the state—and the discipline needed to maintain it…

In empires around the world, the sight and sound of time from high towers had begun to organize the lives of the people, and project a message of power and order.

It is tempting, in the twenty-first century, to feel that we are the first generation to resent being governed by the clock as we go about our daily lives; that we are no longer in control of what we do and when we do it because we must follow the clock’s orders. During our long warehouse shifts, sitting at our factory workstations, or enduring seemingly never-ending meetings at the office, we might grumble that the morning is dragging on, but we cannot eat because the clock has not yet got around to lunchtime. But these feelings are nothing new. In fact, while the public sundial was new to Romans in 263 BC, it had been in widespread use long before that in other cities around the world; the first water clocks date back even further than sundials, more than 3,500 years to ancient Babylon and Egypt.

It is easy to think that public clocks are an inevitable feature of our lives. But by looking more closely at their history, we can understand better what they used to mean—and why they were built in the first place. Because wherever we are, as far back as we care to look, we can find that monumental timekeepers mounted high up on towers or public buildings have been put there to keep us in order, in a world of violent disorder.

Public time has been on the march for thousands of years: “Monumental Timekeepers,” an excerpt from David Rooney’s (@rooneyvision) About Time: A History of Civilization in Twelve Clocks. Via @longnow.

* Alan Lightman


As we watch the clock, we might spare a thought for George Alfred Leon Sarton; he died on this date in 1956. A chemist by training, his primary interest lay in the past practices and precepts of his field…an interest that led him to found the discipline of the history of science as an independent field of study. His most influential work was the Introduction to the History of Science (three volumes totaling 4,296 pages), which effectively founded that discipline. Sarton ultimately aimed to achieve an integrated philosophy of science that connected the sciences and the humanities– what he called “the new humanism.” His name is honored with the prestigious George Sarton Medal, awarded by the History of Science Society.


“Consciousness was upon him before he could get out of the way”*…

Some scientists, when looking at the ladder of nature, find no clear line between mind and no-mind…

Last year, the cover of New Scientist ran the headline, “Is the Universe Conscious?” Mathematician and physicist Johannes Kleiner, at the Munich Center for Mathematical Philosophy in Germany, told author Michael Brooks that a mathematically precise definition of consciousness could mean that the cosmos is suffused with subjective experience. “This could be the beginning of a scientific revolution,” Kleiner said, referring to research he and others have been conducting. 

Kleiner and his colleagues are focused on the Integrated Information Theory of consciousness, one of the more prominent theories of consciousness today. As Kleiner notes, IIT (as the theory is known) is thoroughly panpsychist because all integrated information has at least one bit of consciousness.

You might see the rise of panpsychism as part of a Copernican trend—the idea that we’re not special. The Earth is not the center of the universe. Humans are not a treasured creation, or even the pinnacle of evolution. So why should we think that creatures with brains, like us, are the sole bearers of consciousness? In fact, panpsychism has been around for thousands of years as one of various solutions to the mind-body problem. David Skrbina’s 2007 book, Panpsychism in the West, provides an excellent history of this intellectual tradition.

While there are many versions of panpsychism, the version I find appealing is known as constitutive panpsychism. It states, to put it simply, that all matter has some associated mind or consciousness, and vice versa. Where there is mind there is matter and where there is matter there is mind. They go together. As modern panpsychists like Alfred North Whitehead, David Ray Griffin, Galen Strawson, and others have argued, all matter has some capacity for feeling, albeit highly rudimentary feeling in most configurations of matter. 

While inanimate matter doesn’t evolve like animate matter, inanimate matter does behave. It does things. It responds to forces. Electrons move in certain ways that differ under different experimental conditions. These types of behaviors have prompted respected physicists to suggest that electrons may have some type of extremely rudimentary mind. For example the late Freeman Dyson, the well-known American physicist, stated in his 1979 book, Disturbing the Universe, that “the processes of human consciousness differ only in degree but not in kind from the processes of choice between quantum states which we call ‘chance’ when made by electrons.” Quantum chance is better framed as quantum choice—choice, not chance, at every level of nature. David Bohm, another well-known American physicist, argued similarly: “The ability of form to be active is the most characteristic feature of mind, and we have something that is mind-like already with the electron.”

Many biologists and philosophers have recognized that there is no hard line between animate and inanimate. J.B.S. Haldane, the eminent British biologist, supported the view that there is no clear demarcation line between what is alive and what is not: “We do not find obvious evidence of life or mind in so-called inert matter…; but if the scientific point of view is correct, we shall ultimately find them, at least in rudimentary form, all through the universe.”…

“Electrons May Very Well Be Conscious”: Tam Hunt (@TamHunt) explains.

* Kingsley Amis


As we challenge (chauvinistic?) conventions, we might spare a thought for a man who was no great respecter of consciousness, B. F. Skinner; he died on this date in 1990. A psychologist, he was the pioneer and champion of what he called “radical behaviorism,” the assumption that behavior is a consequence of environmental histories of “reinforcement” (reactions to positive and negative stimuli):

What is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer’s own body. This does not mean, as I shall show later, that introspection is a kind of psychological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection.

About Behaviorism

Building on the work of Ivan Pavlov and John B. Watson, Skinner used operant conditioning to strengthen behavior, considering the rate of response to be the most effective measure of response strength. To study operant conditioning, he invented the operant conditioning chamber (aka the Skinner box).

Cf. also: Thomas Pynchon’s Gravity’s Rainbow.


“The most incomprehensible thing about the world is that it is at all comprehensible”*…

There is an order to the ordered search for ordered understanding…

Science (from Latin scientia, meaning “knowledge”) is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe.

Modern science is typically divided into three major branches that consist of the natural sciences (biology, chemistry, physics, astronomy and Earth science), which study nature in the broadest sense; the social sciences (e.g. psychology, sociology, economics, history), which study people and societies; and the formal sciences (e.g. mathematics, logic, theoretical computer science), which study abstract concepts. There is disagreement, however, on the formal sciences being a science [as they use an a priori, as opposed to empirical, methodology]. Disciplines that use science, such as engineering and medicine, are described as applied sciences…

And there is a dazzling array of “branches” of the scientific endeavor:

Acanthochronology – study of cactus spines grown in time ordered sequence

Acarology – study of mites and ticks

Aceology – science of remedies, or of therapeutics; iamatology

Acology – study of medical remedies

Acoustics – science of sound

Actinobiology – synonymous with radiobiology

Adenology – study of glands…

Browse dozens and dozens at “Index of branches of science,” from Wikipedia… whose contributors may be erring on the generous side, as the list includes such entries as “Hamartiology” (the study of sin) and “Taxidermy” (the art of curing and stuffing animals).

* Albert Einstein


As we tackle taxonomy, we might recall that it was on this date in 2013 that Google experienced a five-minute outage affecting all of its services, including Google Search, YouTube, and Google Drive. During that brief period global internet traffic dropped 40%.
