(Roughly) Daily

Posts Tagged ‘computer’

“The golden ratio is the key”*…

… in any case, to good design. So, how did it come into currency? Western tradition tends to credit the Greeks and Euclid (via Fibonacci), while acknowledging that they may have been inspired by the Egyptians. But recent research has surfaced a more tantalizing prospect:

Design remains a largely white profession, with Black people still vastly underrepresented – making up just 3% of the design industry, according to a 2019 survey

Part of the lack of representation might have had to do with the fact that prevailing tenets of design seemed to hew closely to Western traditions, with purported origins in Ancient Greece and the schools out of Germany, Russia and the Netherlands deemed paragons of the field. A “Black aesthetic” has seemed to be altogether absent.

But what if a uniquely African aesthetic has been deeply embedded in Western design all along? 

Through my research collaboration with design scholar Ron Eglash, author of “African Fractals,” I discovered that the design style that undergirds much of the graphic design profession today – the Swiss design tradition that uses the golden ratio – may have roots in African culture.

The golden ratio refers to the mathematical expression of “1: phi,” where phi is an irrational number, roughly 1.618. 

Visually, this ratio can be represented as the “golden rectangle,” with the ratio of side “a” to side “b” the same as the ratio of the sides “a”-plus-“b” to “a.” 

The golden rectangle. If you divide ‘a’ by ‘b’ and ‘a’-plus-‘b’ by ‘a,’ you get phi, which is roughly 1.618

Create a square on one side of the golden rectangle, and the remaining space will form another golden rectangle. Repeat that process in each new golden rectangle, subdividing in the same direction, and you’ll get a golden spiral [the image at the top of this post], arguably the more popular and recognizable representation of the golden ratio.
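To make the subdivision concrete, here’s a short Python sketch (my own illustration, not from the article): cut a square off a golden rectangle and check that the rectangle left over is golden too.

```python
# A minimal sketch of the construction described above. Start with a
# rectangle whose sides are in the ratio phi : 1, repeatedly remove a
# square, and watch the aspect ratio of each leftover rectangle.
PHI = (1 + 5 ** 0.5) / 2  # phi ~ 1.618..., the positive root of x^2 = x + 1

def subdivide(width, height, steps):
    """Cut a height-by-height square off the long side repeatedly;
    yield the aspect ratio of each leftover rectangle."""
    for _ in range(steps):
        width, height = height, width - height  # remove one square
        yield width / height

a, b = PHI, 1.0  # sides of a golden rectangle
ratios = list(subdivide(a, b, 5))
# Every leftover rectangle is again golden: each ratio is ~1.618.
```

Connecting quarter-circle arcs through the removed squares is what traces out the golden spiral.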

This ratio is called “golden” or “divine” because it’s visually pleasing, and some scholars argue that the human eye can more readily interpret images that incorporate it.

For these reasons, you’ll see the golden ratio, rectangle and spiral incorporated into the design of public spaces and emulated in the artwork in museum halls and hanging on gallery walls. It’s also reflected in nature, architecture, and design – and it forms a key component of modern Swiss design.

The Swiss design style emerged in the 20th century from an amalgamation of Russian, Dutch and German aesthetics. It’s been called one of the most important movements in the history of graphic design and provided the foundation for the rise of modernist graphic design in North America.

The Helvetica font, which originated in Switzerland, and Swiss graphic compositions – from ads to book covers, web pages and posters – are often organized according to the golden rectangle. Swiss architect Le Corbusier famously centered his design philosophy on the golden ratio, which he described as “[resounding] in man by an organic inevitability.”

An ad for Swiss Air by graphic designer Josef Müller-Brockmann incorporates the golden ratio. Grafic Notes

Graphic design scholars – represented particularly by the Roman architecture scholar Marcus Vitruvius Pollio – have tended to credit early Greek culture for incorporating the golden rectangle into design. They’ll point to the Parthenon as a notable example of a building that implemented the ratio in its construction.

But empirical measurements don’t support the Parthenon’s purported golden proportions, since its actual ratio is 4:9 – two whole numbers. As I’ve pointed out, the Greeks, notably the mathematician Euclid, were aware of the golden ratio, but it was mentioned only in the context of the relationship between two lines or figures. No Greek sources use the phrase “golden rectangle” or suggest its use in design.

In fact, ancient Greek writings on architecture almost always stress the importance of whole number ratios, not the golden ratio. To the Greeks, whole number ratios represented Platonic concepts of perfection, so it’s far more likely that the Parthenon would have been built in accordance with these ideals.

If not from the ancient Greeks, where, then, did the golden rectangle originate? 

In Africa, design practices tend to focus on bottom-up growth and organic, fractal forms. They are created in a sort of feedback loop, what computer scientists call “recursion.” You start with a basic shape and then divide it into smaller versions of itself, so that the subdivisions are embedded in the original shape. What emerges is called a “self-similar” pattern, because the whole can be found in the parts… 
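That recursive “feedback loop” can be sketched in a few lines of Python (a toy illustration of the general idea, my addition, with each “shape” reduced to a bare size value):

```python
# A hedged sketch of recursion producing a self-similar pattern: start
# with one shape, replace it with smaller copies of itself, and repeat.
def self_similar(size, depth, copies=3, scale=0.5):
    """Recursively subdivide a shape into scaled-down copies of itself;
    return the flat list of leaf-level sizes."""
    if depth == 0:
        return [size]
    shapes = []
    for _ in range(copies):
        shapes.extend(self_similar(size * scale, depth - 1, copies, scale))
    return shapes

# Three levels of subdivision: 3**3 = 27 copies, each 1/8 the original
# size -- the whole is literally found in the parts.
parts = self_similar(1.0, 3)
```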

Robert Bringhurst, author of the canonical work “The Elements of Typographic Style,” subtly hints at the golden ratio’s African origins:

“If we look for a numerical approximation to this ratio, 1: phi, we will find it in something called the Fibonacci series, named for the thirteenth-century mathematician Leonardo Fibonacci. Though he died two centuries before Gutenberg, Fibonacci is important in the history of European typography as well as mathematics. He was born in Pisa but studied in North Africa.”

These scaling patterns can be seen in ancient Egyptian design, and archaeological evidence shows that African cultural influences traveled down the Nile River. For instance, Egyptologist Alexander Badawy found the Fibonacci series’ use in the layout of the Temple of Karnak. It is arranged in the same way African villages grow: starting with a sacred altar or “seed shape” before accumulating larger spaces that spiral outward.

Given that Fibonacci traveled to North Africa specifically to learn about mathematics, it is not unreasonable to speculate that he brought the sequence back with him. Its first appearance in Europe is not in ancient Greece, but in “Liber Abaci,” Fibonacci’s book of math published in Italy in 1202.
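The connection Bringhurst alludes to between the Fibonacci series and phi is easy to verify (a quick Python sketch, my addition, not from the article):

```python
# The ratio of consecutive Fibonacci numbers converges to phi, which is
# why the sequence serves as a numerical approximation of the golden ratio.
def fibonacci(n):
    """Return the first n Fibonacci numbers."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq

fibs = fibonacci(20)
ratios = [b / a for a, b in zip(fibs, fibs[1:])]
# ratios[-1] is ~1.61803..., matching phi = (1 + sqrt(5)) / 2
```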

Why does all of this matter?

Well, in many ways, it doesn’t. We care about “who was first” only because we live in a system obsessed with proclaiming some people winners – the intellectual property owners that history should remember. That same system declares some people losers, removed from history and, subsequently, their lands, undeserving of any due reparations. 

Yet as many strive to live in a just, equitable and peaceful world, it is important to restore a more multicultural sense of intellectual history, particularly within graphic design’s canon. And once Black graphic design students see the influences of their predecessors, perhaps they will be inspired and motivated anew to recover that history – and continue to build upon its legacy.

The longer-than-we’ve-acknowledged history of the Golden Ratio in design; Audrey Bennett (@audreygbennett) unpacks “The African roots of Swiss design.”

For more on Fibonacci‘s acquisitive habits, see this earlier post.

* Sir Edward Victor Appleton, Nobel Laureate in physics (1947)


As we ruminate on relationships, we might send carefully-calculated birthday greetings to Mary Jackson; she was born on this date in 1921. A mathematician and aerospace engineer, she worked at Langley Research Center in Hampton, Virginia (part of the National Advisory Committee for Aeronautics [NACA], which in 1958 was succeeded by the National Aeronautics and Space Administration [NASA]) for most of her career. She began as a “computer” at the segregated West Area Computing division in 1951; in 1958, she became NASA’s first black female engineer.

Jackson’s story features in the 2016 non-fiction book Hidden Figures: The American Dream and the Untold Story of the Black Women Who Helped Win the Space Race. She is one of the three protagonists in Hidden Figures, the film adaptation released the same year. In 2019, she was posthumously awarded the Congressional Gold Medal; in 2020 the Washington, D.C. headquarters of NASA was renamed the Mary W. Jackson NASA Headquarters.


“We know the past but cannot control it. We control the future but cannot know it.”*…

Readers will know of your correspondent’s fascination with– and admiration for– Claude Shannon

Within engineering and mathematics circles, Shannon is a revered figure. At 21 [in 1937], he published what’s been called the most important master’s thesis of all time, explaining how binary switches could do logic and laying the foundation for all future digital computers. At the age of 32, he published A Mathematical Theory of Communication, which Scientific American called “the Magna Carta of the information age.” Shannon’s masterwork invented the bit, or the objective measurement of information, and explained how digital codes could allow us to compress and send any message with perfect accuracy.
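Shannon’s “objective measurement of information” can be illustrated in a few lines of Python (my own sketch of his entropy formula, not drawn from Soni’s piece):

```python
# Shannon entropy, H = -sum(p * log2(p)): the average number of bits
# needed per symbol from a source with the given outcome probabilities.
from math import log2

def entropy(probabilities):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit per flip...
fair = entropy([0.5, 0.5])    # 1.0
# ...while a biased coin carries less, because its outcomes are more
# predictable -- the insight behind compressing any message.
biased = entropy([0.9, 0.1])  # ~0.469
```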

But Shannon wasn’t just a brilliant theoretical mind — he was a remarkably fun, practical, and inventive one as well. There are plenty of mathematicians and engineers who write great papers. There are fewer who, like Shannon, are also jugglers, unicyclists, gadgeteers, first-rate chess players, codebreakers, expert stock pickers, and amateur poets.

Shannon worked on the top-secret transatlantic phone line connecting FDR and Winston Churchill during World War II and co-built what was arguably the world’s first wearable computer. He learned to fly airplanes and played the jazz clarinet. He rigged up a false wall in his house that could rotate with the press of a button, and he once built a gadget whose only purpose when it was turned on was to open up, release a mechanical hand, and turn itself off. Oh, and he once had a photo spread in Vogue.

Think of him as a cross between Albert Einstein and the Dos Equis guy…

From Jimmy Soni (@jimmyasoni), co-author of A Mind At Play: How Claude Shannon Invented the Information Age: “11 Life Lessons From History’s Most Underrated Genius.”

* Claude Shannon


As we learn from the best, we might recall that it was on this date in 1946 that an early beneficiary of Shannon’s thinking, the ENIAC (Electronic Numerical Integrator And Computer), was first demonstrated in operation.  (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943, as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and Presper Eckert of the University of Pennsylvania, where it was built.  The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world.  ENIAC’s basic clock speed was 100,000 cycles per second. Today’s home computers have clock speeds of 1,000,000,000 cycles per second.


“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…




In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.
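Moore’s Law, in its usual formulation, amounts to a simple exponential (a back-of-the-envelope sketch; the Intel 4004 figures are my illustrative example, not the essay’s):

```python
# Moore's Law as commonly stated: transistor counts on a chip double
# roughly every two years.
def moores_law(start_count, start_year, year, doubling_period=2.0):
    """Projected transistor count, assuming a doubling every
    `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# From ~2,300 transistors (Intel 4004, 1971), fifty years of doubling
# projects into the tens of billions -- the right order of magnitude
# for today's largest chips.
projected = moores_law(2300, 1971, 2021)
```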

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead


As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped them enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses




“PLATO was my Alexandria. It was my library, it was the place where I could attach myself to anything.”*…




Once upon a time [in the 60s and 70s], there was a computer network with thousands of users across the world. It featured chat rooms, message boards, multiplayer games, a blog-like newspaper, and accredited distance learning, all piped to flat-panel plasma screens that were also touchscreens. And it wasn’t the internet.

It was PLATO (Programmed Logic for Automatic Teaching Operations), and its original purpose was to harness the power of the still-obscure world of computing as a teaching tool. Developing PLATO required simultaneous quantum leaps in technological sophistication, and it worked—college and high-school students quickly learned how to use it, and also pushed it to do new things.

Despite decades of use at major universities, it all but vanished in the 1980s and from popular memory in the years that followed, a victim of the microcomputer revolution. At its peak, PLATO was surprisingly similar to the modern internet, and it left its DNA in technology we still use today…

The story of the ur-internet: “PLATO.”

* novelist Richard Powers (who was a coder before he turned to literary fiction)


As we log on, we might send super birthday greetings to Seymour Roger Cray; he was born on this date in 1925.  An electrical engineer and computer architect, he designed a series of computers that were the fastest in the world for decades, and founded Cray Research, which built many of these machines– effectively creating the “supercomputer” industry and earning the honorific “father of supercomputing.”


With a Cray-1



Written by LW

September 28, 2019 at 1:01 am

“Knowledge speaks, but wisdom listens”*…



You might think that digital technologies, often considered a product of ‘the West,’ would hasten the divergence of Eastern and Western philosophies. But within the study of Vedanta, an ancient Indian school of thought, I see the opposite effect at work. Thanks to our growing familiarity with computing, virtual reality (VR) and artificial intelligence (AI), ‘modern’ societies are now better placed than ever to grasp the insights of this tradition.

Vedanta summarises the metaphysics of the Upanishads, a clutch of Sanskrit religious texts, likely written between 800 and 500 BCE. They form the basis for the many philosophical, spiritual and mystical traditions of the Indian sub-continent. The Upanishads were also a source of inspiration for some modern scientists, including Albert Einstein, Erwin Schrödinger and Werner Heisenberg, as they struggled to comprehend quantum physics of the 20th century…

Philosopher and Vaishnava Hindu theologian Akhandadhi Das, a member of the Science and Philosophy Initiative, explains how “Modern technology is akin to the metaphysics of Vedanta.”

* Jimi Hendrix


As we muse on metaphor, we might send carefully-constructed birthday greetings to Donald Knuth; he was born on this date in 1938. A computer scientist, mathematician, and professor emeritus at Stanford, he made numerous substantive contributions to computer science, both practically and theoretically.  But he is probably best known as the author of the multi-volume work The Art of Computer Programming, which he began writing in 1962, began to publish in 1968… and has (via multiple revisions/additions) still not finished.  Called by the New York Times “the profession’s defining treatise,” it won Knuth the Turing Award in 1974.

That said, it’s surely worth noting Knuth’s other major contribution to our modern zeitgeist: his “Potrzebie System of Weights and Measures,” published in Issue 33 of Mad Magazine when he was 19 years old.



Written by LW

January 10, 2019 at 1:01 am
