(Roughly) Daily

Posts Tagged ‘eclipse’

“Memento mori”*…

Via Matt Muir and his wonderful newsletter, Web Curios

Do YOU want to feel INCREDIBLY OLD? I mean, I imagine you already feel incredibly old, what with, well, your age, and the terrifying pace of everything, and the fact that nothing makes sense, but if you would like to feel EVEN OLDER then you will ‘enjoy’ this site which presents a whole load of ‘XX was closer to WWII than Y’-type facts to help you realise quite how much time has passed since you’ve been alive and how transient and ephemeral your existence in fact is. You can even plug in your birthday to get personalised horrordata…

Try it for yourself at: “Another day closer to the end.” No extra points for guessing that the example illustrated in the pic above was generated with your (very old) correspondent’s birthday.

On the other hand?: “Your future self has good news: you’re doing pretty well.”

* Latin phrase: “remember that you will die”

###

As we muse on mutability and mortality, we might note that this is Good Friday… and that in 1983, British scholars Colin Humphreys and W.G. Waddington used NASA research into historical eclipses to place the Crucifixion on this date in 33 CE.

Pieter Brueghel the Younger (source)

Written by (Roughly) Daily

April 3, 2026 at 1:00 am

“The moon is a friend for the lonesome to talk to”*…

… and, as Bartosz Ciechanowski explains in a stunningly-illustrated essay, so much more. The moon affects our tides, our light, and even the Earth’s rotation– it’s no wonder that our constant companion has so haunted human culture…

… The Moon may be just an unassuming neighbor in the sky, but its presence affects our lives in many subtle ways. When it reflects sunlight off its scarred surface to guide the way in the darkness of night, or as it breathes life into oceans by rhythmically raising tides, or when it cloaks the Sun in a rare and awe-inspiring total solar eclipse, the Moon reminds us of the celestial world right outside of the safe confines of our planet.

Traveling through the cold and empty space by Earth’s side, the Moon is always just there. It may be barren and dull, but, undeterred by its own lifelessness, it never leaves us completely alone.

Perhaps the next time you catch a glimpse of the Moon’s shiny surface beaming in the night sky, you’ll see it a little differently – not as a mundane fixture of the heavens, but as a fellow companion that gently affects our own existence…

Absolutely fascinating– and beautiful: “Moon,” from @bciechanowski.bsky.social. Via @TheBrowser.

This is (R)D’s second visit with Ciechanowski, who earlier helped us understand “Sound”; for more of his extraordinary work, visit his archive.

More on the moon here and here.

* Carl Sandburg

###

As we raise our eyes, we might send celestial birthday greetings to William Wilson Morgan; he was born on this date in 1906. An astronomer and astrophysicist, he was a professor and astronomy director at the University of Chicago’s Yerkes Observatory in Wisconsin and managing editor of the Astrophysical Journal. He was a leader in stellar and galaxy classification and helped prove the existence of spiral arms in our galaxy.

source

“Oh dark, dark, dark, amid the blaze of noon, irrevocably dark, total eclipse without all hope of day”*…

Today is the occasion of an annular eclipse, which will pass through eight U.S. states before crossing the Gulf of Mexico to transit Mexico, Guatemala, Belize, Honduras, Nicaragua, Costa Rica, Panama, Colombia, and Brazil. While some people in the Western Hemisphere will witness a “ring of fire” during the eclipse, many more will experience the phenomenon of crescent sunlight. Rebecca Boyle has advice on how we might approach it…

… This Saturday, for some people in the Western Hemisphere, the Sun will disappear for a few minutes and appear to leave a flaming hole in the sky. Instead of a ball of fire, the Sun will transform into a ring of fire, a strange and wondrous sight. This is an annular solar eclipse, and it happens because the Moon is right smack in front of the Sun.

A solar eclipse only happens during new Moon phases, when we otherwise wouldn’t be able to see our nearest celestial companion. Though we get a new Moon every month, we do not get solar eclipses as often because of our satellite’s oddball path around the planet. Sometimes the Moon casts a shadow just above Earth, and sometimes just below. This weekend, the Moon’s shadow will fall onto Earth, just right for people in parts of the Western Hemisphere to see it.

The annular eclipse is a preview of a more incredible, rarer event next April, when a total solar eclipse will cross the continental United States. There is no experience on Earth like a total eclipse; make plans to see it, if you can. But this weekend’s “ring of fire” eclipse is an event you should try to see first (safely, with eclipse glasses), if you can get yourself into the western U.S. or parts of Central and South America. Here’s a map showing the eclipse path; if you can’t travel to see it in person, you can watch the eclipse online.

Eclipses happen because the Sun and Moon appear to be roughly the same diameter. The Sun is actually about 400 times larger than the Moon, but it is also about 400 times more distant, so they seem like the same size in our sky.
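That oft-quoted 400:1 coincidence can be checked with back-of-the-envelope arithmetic: a body’s apparent angular size is roughly its diameter divided by its distance. A minimal sketch (the diameters and mean distances below are standard reference values, rounded; the names are mine, chosen for illustration):

```python
import math

# Rough mean values, rounded (assumptions for illustration):
SUN_DIAMETER_KM = 1_391_400
MOON_DIAMETER_KM = 3_475
SUN_DISTANCE_KM = 149_600_000   # ~1 astronomical unit
MOON_DISTANCE_KM = 384_400      # mean Earth-Moon distance

def angular_diameter_deg(diameter_km: float, distance_km: float) -> float:
    """Apparent angular size, in degrees, of a sphere seen from afar."""
    return math.degrees(2 * math.atan(diameter_km / (2 * distance_km)))

sun = angular_diameter_deg(SUN_DIAMETER_KM, SUN_DISTANCE_KM)
moon = angular_diameter_deg(MOON_DIAMETER_KM, MOON_DISTANCE_KM)

# Both work out to roughly half a degree of sky:
print(f"Sun:  {sun:.3f} deg")
print(f"Moon: {moon:.3f} deg")
print(f"size ratio:     {SUN_DIAMETER_KM / MOON_DIAMETER_KM:.0f}x")
print(f"distance ratio: {SUN_DISTANCE_KM / MOON_DISTANCE_KM:.0f}x")
```

At mean distances the match isn’t exact (the distance ratio comes out closer to 390 than 400), and the Moon’s distance itself swings between roughly 356,500 and 406,700 km over its orbit; when the Moon is near apogee it appears slightly smaller than the Sun, which is exactly why this weekend’s eclipse is annular rather than total.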

The Moon’s shadow forms two concentric cones: an inner shadow called the umbra, where the Sun is completely obscured, and an outer, broader cone called the penumbra, where sunlight still shines but is partially blocked. The umbra traces a narrow geographic ribbon across the Americas, and it’s where you will see a full eclipse; under the penumbra, which covers much of the western U.S. and Central and South America, you will see a partial eclipse.

Like the gears of a clock, a combination of precise positions and movements initiates an eclipse of the Sun. As Earth spins, day breaks. The Sun and Moon appear to trace a path across the sky. The Sun is not moving (at least not perceptibly); Earth’s rotation makes the star’s position change. The Moon is moving around us while the Earth rotates, so it seems to move too, but it appears to go slower than our star. The partial solar eclipse begins as the Sun catches up to the Moon’s position in our sky. On Saturday morning around 8:06 a.m. Pacific time, people in Eugene, Oregon, will be the first to see the Moon appear to take a bite out of the Sun. The bite will get progressively bigger until the full annular eclipse begins at 9:16 a.m. Pacific time.

The annular eclipse only lasts about four minutes (depending on your precise location under the Moon’s shadow) but the partial eclipse, which will be visible over a much wider geographic area, lasts about an hour and 15 minutes before and afterward. During this phase, shadows cast by objects on Earth will change in unusual ways. One lovely place to be during a partial solar eclipse is underneath a tree, if you can find an evergreen or a deciduous tree that has not dropped its leaves yet. Look at the ground. In the dappled light, you will see crescents everywhere: the crescent Sun.

Sunlight is the heavens reaching down to touch us right where we stand; I think about this when I step into the light. But crescent sunlight is the Moon joining this experience. Its darkness, rather than its light, reaches out to touch us, too…

An informative and lyrical guide to today’s eclipse: “During an Annular Eclipse, Look to the Shadows,” from @rboyle31 in @atlasobscura.

* John Milton, Samson Agonistes

###

As we don’t look directly, we might recall that on this date in 1609, Galileo (who has claim to the titles Father of observational astronomy, modern-era classical physics, the scientific method, and modern science) put the telescope to use in his astronomical work. Upon hearing (at age 40) that a Dutch optician had invented a glass that made distant objects appear larger, Galileo crafted his telescope. He continued to improve his device, ultimately achieving 30X magnification, and recorded his observations of the Moon, the moons of Jupiter, the phases of Venus, sunspots, the Milky Way, and more. He published his initial telescopic astronomical observations in March 1610 in a brief treatise entitled Sidereus Nuncius (Starry Messenger).

Telescopes were also a profitable sideline for Galileo, who sold them to merchants who found them useful both at sea and as items of trade.

Galileo’s “cannocchiali” telescopes at the Museo Galileo, Florence (source)

Written by (Roughly) Daily

October 14, 2023 at 1:00 am

“If we are to prevent megatechnics from further controlling and deforming every aspect of human culture, we shall be able to do so only with the aid of a radically different model derived directly, not from machines, but from living organisms and organic complexes (ecosystems)”*…

In a riff on Lewis Mumford, the redoubtable L. M. Sacasas addresses the unraveling of modernity…

The myth of the machine underlies a set of three related and interlocking presumptions which characterized modernity: objectivity, impartiality, and neutrality. More specifically, the presumptions that we could have objectively secured knowledge, impartial political and legal institutions, and technologies that were essentially neutral tools but which were ordinarily beneficent. The last of these appears to stand somewhat apart from the first two in that it refers to material culture rather than to what might be taken as more abstract intellectual or moral stances. In truth, however, they are closely related. The more abstract intellectual and institutional pursuits were always sustained by a material infrastructure, and, more importantly, the machine supplied a master template for the organization of human affairs.

Just as the modern story began with the quest for objectively secured knowledge, this ideal may have been the first to lose its implicit plausibility. From the late 19th century onward, philosophers, physicists, sociologists, anthropologists, psychologists, and historians, among others, have proposed a more complex picture that emphasized the subjective, limited, contingent, situated, and even irrational dimensions of how humans come to know the world. The ideal of objectively secured knowledge became increasingly questionable throughout the 20th century. Some of these trends get folded under the label “postmodernism,” but I found the term unhelpful at best a decade ago and now find it altogether useless.

We can similarly trace a growing disillusionment with the ostensible impartiality of modern institutions. This takes at least two forms. On the one hand, we might consider the frustrating and demoralizing character of modern bureaucracies, which we can describe as rule-based machines designed to outsource judgement and enhance efficiency. On the other, we can note the heightened awareness of the actual failures of modern institutions to live up to the ideals of impartiality, which has been, in part, a function of the digital information ecosystem.

But while faith in the possibility of objectively secured knowledge and impartial institutions faltered, the myth of the machine persisted in the presumption that technology itself was fundamentally neutral. Until very recently, that is. Or so it seems. And my thesis (always for disputation) is that the collapse of this last manifestation of the myth brings the whole house down. This is in part because of how much work the presumption of technological neutrality was doing all along to hold American society together. (International readers: as always, read with a view to your own setting. I suspect there are some areas of broad overlap and other instances when my analysis won’t travel well.) Already by the late 19th century, progress had become synonymous with technological advancement, as Leo Marx argued. If social, political, or moral progress stalled, then at least the advance of technology could be counted on…

But over the last several years, the plausibility of this last and also archetypal manifestation of the myth of the machine has also waned. Not altogether, to be sure, but in important and influential segments of society and throughout a wide cross-section of society, too. One can perhaps see the shift most clearly in the public discourse about social media and smart phones, but this may be a symptom of a larger disillusionment with technology. And not only technological artifacts and systems, but also with the technocratic ethos and the public role of expertise.

If the myth of the machine, in these three manifestations, was, in fact, a critical element of the culture of modernity, underpinning its aspirations, then when each in turn becomes increasingly implausible the modern world order comes apart. I’d say that this is more or less where we’re at. You could usefully analyze any number of cultural fault lines through this lens. The center, which may not in fact hold, is where you find those who still operate as if the presumptions of objectivity, impartiality, and neutrality still compelled broad cultural assent, and they are now assailed from both the left and the right by those who have grown suspicious or altogether scornful of such presumptions. Indeed, the left/right distinction may be less helpful than the distinction between those who uphold some combination of the values of objectivity, impartiality, and neutrality and those who no longer find them compelling or desirable.

What happens when the systems and strategies deployed to channel often violent clashes within a population deeply, possibly intractably divided about substantive moral goods and now even about what Arendt characterized as the publicly accessible facts upon which competing opinions could be grounded—what happens when these systems and strategies fail?

It is possible to argue that they failed long ago, but the failure was veiled by an unevenly distributed wave of material abundance. Citizens became consumers and, by and large, made peace with the exchange. After all, if the machinery of government could run of its own accord, what was there left to do but enjoy the fruits of prosperity? But what if abundance was an unsustainable solution, either because it taxed the earth at too high a rate or because it was purchased at the cost of other values such as rootedness, meaningful work and involvement in civic life, abiding friendships, personal autonomy, and participation in rich communities of mutual care and support? Perhaps in the framing of that question, I’ve tipped my hand about what might be the path forward.

At the heart of technological modernity there was the desire—sometimes veiled, often explicit—to overcome the human condition. The myth of the machine concealed an anti-human logic: if the problem is the failure of the human to conform to the pattern of the machine, then bend the human to the shape of the machine or eliminate the human altogether. The slogan of one of the high-modernist world’s fairs of the 1930s comes to mind: “Science Finds, Industry Applies, Man Conforms.” What is now being discovered in some quarters, however, is that the human is never quite eliminated, only diminished…

Eminently worth reading in full: “The Myth of the Machine,” from @LMSacasas.

For a deep dive into similar waters, see John Ralston Saul‘s (@JohnRalstonSaul) Voltaire’s Bastards.

[Image above: source]

* Lewis Mumford, The Myth of the Machine

###

As we rethink rudiments, we might recall that it was on this date in 1919 that Arthur Eddington confirmed Einstein’s light-bending prediction– a part of the theory of general relativity– using photos of a solar eclipse. Eddington’s paper the following year was the “debut” of Einstein’s theoretical work in most of the English-speaking world (and occasioned an urban legend: when a reporter supposedly suggested that “only three people understand relativity,” Eddington is said to have jokingly replied “Oh, who’s the third?”)

One of Eddington’s photographs of the total solar eclipse of 29 May 1919, presented in his 1920 paper announcing its success, confirming Einstein’s theory that light “bends”

“The speed of time is 1 hour per hour, no matter what else is going on in the universe”*…

All the Light You See” (02017–02019) by Alicia Eggert. Photo by Ryan Strand Greenberg.

The most commonly-used noun in the English language is, according to the Oxford English Corpus, time. Its frequency is partly due to its multiplicity of meanings, and partly due to its use in common phrases. Above all, “time” is ubiquitous because what it refers to dictates all aspects of human life, from the hour we rise to the hour we sleep and most everything in between.

But what is time? The irony, of course, is that it’s hard to say. Trying to pin down its meaning using words can oftentimes feel like grasping at a wriggling fish. The 4th-century Christian theologian Saint Augustine sums up the dilemma well:

But what in speaking do we refer to more familiarly and knowingly than time? And certainly we understand when we speak of it; we understand also when we hear it spoken of by another. What, then, is time? If no one asks me, I know; if I wish to explain to him who asks, I know not.

Most of us are content to live in a world where time is simply what a clock reads. The interdisciplinary artist Alicia Eggert is not. Through co-opting clocks and forms of commercial signage (billboards, neon signs, inflatable nylon of the kind that animates the air dancers in the parking lots of auto dealerships), Eggert makes conceptual art that invites us to experience the dimensions of time through the language we use to talk about it.

Her art draws on theories of time from physics and philosophy, like the inseparability of time and space and the difference between being and becoming. She expresses these oftentimes complex ideas through simple words and phrases we make use of in our everyday lives, thereby making them tangible and relatable…

From Ahmed Kabil (@ahmedkabil) and The Long Now Foundation, a (wonderfully-illustrated) appreciation of the art of Alicia Eggert (@AliciaEggert) and the questions it addresses: “How Long is Now?”

* Sean M. Carroll

###

As we tackle time, we might recall that it was on this date in 585 BCE that a solar eclipse occurred. According to The Histories of Herodotus, the Greek philosopher Thales of Miletus accurately predicted the event. (If Herodotus’s account is accurate, this eclipse is the earliest recorded as being known in advance of its occurrence.)

According to Herodotus, the appearance of the eclipse was interpreted as an omen, and interrupted a battle in a long-standing war between the Medes and the Lydians. The fighting immediately stopped, and they agreed to a truce. Because astronomers can calculate the dates of historical eclipses, Isaac Asimov described this battle as the earliest historical event whose date is known with precision to the day, and called the prediction “the birth of science”; in any case, “the eclipse of Thales” is one of the cardinal dates from which other dates can be calculated.

source

Written by (Roughly) Daily

May 28, 2021 at 1:01 am