Posts Tagged ‘computer graphics’
“I’m a little teapot / Short and stout”*…

The fascinating story of the “Utah teapot,” the ur-object in the development of computer graphics…
This unassuming object—the “Utah teapot,” as it’s affectionately known—has had an enormous influence on the history of computing, dating back to 1974, when computer scientist Martin Newell was a Ph.D. student at the University of Utah.
The U of U was a powerhouse of computer graphics research then, and Newell had some novel ideas for algorithms that could realistically display 3D shapes—rendering complex effects like shadows, reflective textures, or rotations that reveal obscured surfaces. But, to his chagrin, he struggled to find a digitized object worthy of his methods. Objects that were typically used for simulating reflections, like a chess pawn, a donut, and an urn, were too simple.
One day over tea, Newell told his wife Sandra that he needed more interesting models. Sandra suggested that he digitize the shapes of the tea service they were using, a simple Melitta set from a local department store. It was an auspicious choice: The curves, handle, lid, and spout of the teapot all conspired to make it an ideal object for graphical experiment. Unlike other objects, the teapot could, for instance, cast a shadow on itself in several places. Newell grabbed some graph paper and a pencil, and sketched it.
Back in his lab, he entered the sketched coordinates—called Bézier control points, first used in the design of automobile bodies—on a Tektronix storage tube, an early text and graphics computer terminal. The result was a lovely virtual teapot, more versatile (and probably cuter) than any 3D model to date.
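Bézier control points remain the standard way to describe such smooth shapes; the teapot itself is a collection of bicubic Bézier patches. As a small illustrative sketch (using made-up control points, not Newell's actual teapot data), a Bézier curve can be evaluated from its control points with de Casteljau's algorithm, which is nothing more than repeated linear interpolation:

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] from its control
    points, by repeatedly interpolating between adjacent points until
    one point remains (de Casteljau's algorithm)."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [
            tuple((1 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

# A cubic curve has 4 control points; the endpoints are hit exactly,
# while the interior points only "pull" the curve toward them.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(de_casteljau(ctrl, 0.0))  # (0.0, 0.0)
print(de_casteljau(ctrl, 0.5))  # (2.0, 1.5)
```

A bicubic patch, like those in the teapot, applies the same idea twice: once along each of two parameters over a 4×4 grid of control points.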
The new model was particularly appealing to Newell’s colleague, Jim Blinn [of whom Ivan Sutherland, the head of the program at Utah and a computer graphics pioneer, said, “There are about a dozen great computer graphics people and Jim Blinn is six of them”]. One day, demonstrating how his software could adjust an object’s height, Blinn flattened the teapot a bit and decided he liked the look of that version better. The distinctive Utah teapot was born.
The computer model proved useful for Newell’s own research, featuring prominently in his next few publications. But he and Blinn also took the important step of sharing their model publicly. As it turned out, other researchers were also starved for interesting 3D models, and the digital teapot was exactly the experimental test bed they needed. At the same time, the shape was simple enough for Newell to input and for computers to process. (Rumor has it some researchers even had the data points memorized!) And unlike many household items, like furniture or fruit-in-a-bowl, the teapot’s simulated surface looked realistic without superimposing an artificial, textured pattern.
The teapot quickly became a beloved staple of the graphics community. Teapot after teapot graced the pages and covers of computer graphics journals. “Anyone with a new idea about rendering and lighting would announce it by first trying it out on a teapot,” writes animator Tom Sito in Moving Innovation...
These days, the Utah teapot has achieved legendary status. It’s a built-in shape in many 3D graphics software packages used for testing, benchmarking, and demonstration. Graphics geeks like to sneak it into scenes and games as an in-joke, an homage to their countless hours of rendering teapots; hence its appearances in Windows, Toy Story, and The Simpsons…
Over the past few years, the teapot has been 3D printed back into the physical world, both as a trinket and as actual china. Pixar even made its own music video in honor of the teapot, titled “This Teapot’s Made for Walking,” and produced a teapot wind-up toy as a promotion for its RenderMan software.
Newell has jokingly lamented that, despite all his algorithmic innovations, he’ll be remembered primarily for “that damned teapot.” But as much as computer scientists try to prove their chops by inventing clever algorithms, test beds for experimentation often leave a bigger mark. Newell essentially designed the model organism of computer graphics: the teapot is to graphics researchers what lab mice are to biologists.
For the rest of us the humble teapot serves as a reminder that, in the right hands, something simple can become an icon of creativity and hidden potential…
How a humble serving piece shaped a technological domain: “The Most Important Object In Computer Graphics History Is This Teapot,” from Jesse Dunietz (@jdunietz)
* from “I’m a Little Teapot,” a 1939 novelty song by George Harold Sanders and Clarence Z. Kelley
###
As we muse on models, we might send foundational birthday greetings to Michael Faraday; he was born on this date in 1791. One of the great experimental scientists of all time, Faraday made huge contributions to the study of electromagnetism and electrochemistry.
Although Faraday received little formal education, he was one of the most influential scientists in history. It was by his research on the magnetic field around a conductor carrying a direct current that Faraday established the basis for the concept of the electromagnetic field in physics. Faraday also established that magnetism could affect rays of light and that there was an underlying relationship between the two phenomena. He similarly discovered the principles of electromagnetic induction and diamagnetism, and the laws of electrolysis. His inventions of electromagnetic rotary devices formed the foundation of electric motor technology, and it was largely due to his efforts that electricity became practical for use in technology [including, of course, computing and computer graphics].
As a chemist, Faraday discovered benzene, investigated the clathrate hydrate of chlorine, invented an early form of the Bunsen burner and the system of oxidation numbers, and popularised terminology such as “anode”, “cathode”, “electrode” and “ion”. Faraday ultimately became the first and foremost Fullerian Professor of Chemistry at the Royal Institution, a lifetime position.
Faraday was an excellent experimentalist who conveyed his ideas in clear and simple language; his mathematical abilities, however, did not extend as far as trigonometry and were limited to the simplest algebra. James Clerk Maxwell took the work of Faraday and others and summarized it in a set of equations which is accepted as the basis of all modern theories of electromagnetic phenomena. On Faraday’s uses of lines of force, Maxwell wrote that they show Faraday “to have been in reality a mathematician of a very high order – one from whom the mathematicians of the future may derive valuable and fertile methods.”…
Albert Einstein kept a picture of Faraday on his study wall, alongside pictures of Arthur Schopenhauer and James Clerk Maxwell. Physicist Ernest Rutherford stated, “When we consider the magnitude and extent of his discoveries and their influence on the progress of science and of industry, there is no honour too great to pay to the memory of Faraday, one of the greatest scientific discoverers of all time.”
Wikipedia
“We often plough so much energy into the big picture, we forget the pixels”*…
Alvy Ray Smith (see also here) was born before computers, made his first computer graphic in 1964, cofounded Pixar, was the first director of computer graphics at Lucasfilm, and the first graphics fellow at Microsoft. He is the author of the terrific new book A Biography of the Pixel (2021), from which, this excerpt…
I have billions of pixels in my cellphone, and you probably do too. But what is a pixel? Why do so many people think that pixels are little abutting squares? Now that we’re aswim in an ocean of zettapixels (21 zeros), it’s time to understand what they are. The underlying idea – a repackaging of infinity – is subtle and beautiful. Far from being squares or dots that ‘sort of’ approximate a smooth visual scene, pixels are the profound and exact concept at the heart of all the images that surround us – the elementary particles of modern pictures.
This brief history of the pixel begins with Joseph Fourier in the French Revolution and ends in the year 2000 – the recent millennium. I strip away the usual mathematical baggage that hides the pixel from ordinary view, and then present a way of looking at what it has wrought.
The millennium is a suitable endpoint because it marked what’s called the great digital convergence, an immense but uncelebrated event, when all the old analogue media types coalesced into the one digital medium. The era of digital light – all pictures, for whatever purposes, made of pixels – thus quietly began. It’s a vast field: books, movies, television, electronic games, cellphone displays, app interfaces, virtual reality, weather satellite images, Mars rover pictures – to mention a few categories – even parking meters and dashboards. Nearly all pictures in the world today are digital light, including nearly all the printed words. In fact, because of the digital explosion, this includes nearly all the pictures ever made. Art museums and kindergartens are among the few remaining analogue bastions, where pictures fashioned from old media can reliably be found…
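The “repackaging of infinity” Smith describes is, at bottom, the sampling theorem: a pixel is a point sample of a continuous image, not a little square, and the continuum can be rebuilt exactly from those samples if the image is band-limited. A minimal one-dimensional sketch of that reconstruction (my illustration, not from the book):

```python
import math

def sinc(x):
    """Normalized sinc, the ideal reconstruction kernel."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, t, dt=1.0):
    """Whittaker-Shannon interpolation: rebuild the continuous signal
    at time t from point samples taken every dt seconds, by summing
    a sinc kernel centered on each sample."""
    return sum(s * sinc((t - n * dt) / dt) for n, s in enumerate(samples))

# Sample a band-limited signal (0.1 Hz, well under the 0.5 Hz Nyquist
# limit for 1 Hz sampling), then recover it *between* the sample points.
f = lambda t: math.sin(2 * math.pi * 0.1 * t)
samples = [f(n) for n in range(100)]
print(round(reconstruct(samples, 50.5), 2), round(f(50.5), 2))  # the two agree closely
```

The same mathematics, in two dimensions, is why a display or renderer that treats pixels as point samples plus a reconstruction filter produces smoother images than one that draws them as abutting squares.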
An exact mathematical concept, pixels are the elementary particles of pictures, based on a subtle unpacking of infinity: “Pixel: a biography,” from @alvyray.
###
As we ruminate on resolution, we might recall that it was on this date in 1947 that fabled computer scientist Grace Hopper (see here and here), then a programmer working on Harvard’s Mark II Aiken Relay Calculator, found and documented the first computer “bug”– an insect that had lodged in the works. The incident is recorded in Hopper’s logbook alongside the offending moth, taped to the logbook page: “15:45 Relay #70 Panel F (moth) in relay. First actual case of bug being found.”
This anecdote has led to Hopper being pretty widely credited with coining the term “bug” (and ultimately “de-bug”) in its technological usage… but the term actually dates back at least to Thomas Edison…

“Last time I checked, the digital universe was expanding at the rate of five trillion bits per second in storage”*…

… and rising. Happily, technologists are keeping up:
The intricate arrangement of base pairs in our DNA encodes just about everything about us. Now, DNA contains the entirety of “The Wonderful Wizard of Oz” as well.
A team of University of Texas at Austin scientists just vastly improved the storage capacity of DNA and managed to encode the entire novel — translated into the geek-friendly language of Esperanto — in a double strand of DNA far more efficiently than has been done before. DNA storage isn’t new, but this work could help finally make it practical…
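The core idea of DNA storage fits in a few lines: with four bases, each base can carry two bits. The toy mapping below is an illustration only, not the UT Austin team's actual scheme (real encodings add error correction and avoid long runs of the same base, which are hard to synthesize and sequence):

```python
BASES = "ACGT"  # two bits per base: 00->A, 01->C, 10->G, 11->T

def encode(data: bytes) -> str:
    """Map each byte to four DNA bases, most significant bits first."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    """Invert the mapping: every four bases become one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        val = 0
        for base in strand[i:i + 4]:
            val = (val << 2) | BASES.index(base)
        out.append(val)
    return bytes(out)

strand = encode(b"Oz")
print(strand)         # CATTCTGG -- 8 bases for 2 bytes
print(decode(strand)) # b'Oz'
```

At this density, a roughly 200,000-character novel needs on the order of 800,000 bases, which is why efficiency gains in synthesis and encoding matter so much.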
The full story at “Scientists Stored ‘The Wizard of Oz’ on a Strand of DNA.” The UT release, here.
* George Dyson
###
As we reconsider Kondo, we might send carefully-stowed birthday greetings to Jay Wright Forrester; he was born on this date in 1914. A pioneering computer engineer and systems scientist, he was one of the inventors of magnetic core memory, the predominant form of random-access computer memory during the most explosive years of digital computer development (between 1955 and 1975). It was part of a family of related technologies which bridged the gap between vacuum tubes and semiconductors by exploiting the magnetic properties of materials to perform switching and amplification.
And close to your correspondent’s heart, Forrester is also believed to have created the first animation in the history of computer graphics, a “bouncing ball” on an oscilloscope.


