(Roughly) Daily

Posts Tagged ‘history of science’

“Few things are more enjoyable than lingering over the atlas and plotting a trip”*…


atlas of outer space


I’m excited to finally share a new design project this week! Over the past year and a half I’ve been working on a collection of ten maps on planets, moons, and outer space. To name a few, I’ve made an animated map of the seasons on Earth, a map of Mars geology, and a map of everything in the solar system bigger than 10km…

Data visualizer extraordinaire Eleanor Lutz has announced “An Atlas of Space.”

Follow her progress on her blog Tabletop Whale, or on Twitter or Tumblr.

[TotH to Kottke]

* J. Maarten Troost


As we see stars, we might spare a thought for Daniel Kirkwood; he died on this date in 1895. Kirkwood’s most significant contribution came from his study of asteroid orbits. When arranging the then-growing number of discovered asteroids by their distance from the Sun, he noted several gaps, now named Kirkwood gaps in his honor, and associated these gaps with orbital resonances with Jupiter.  Further, Kirkwood suggested a similar dynamic was responsible for the Cassini Division in Saturn’s rings, as the result of a resonance with one of Saturn’s moons.  In the same paper, he was the first to correctly posit that the material in meteor showers is cometary debris.
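The resonance mechanism Kirkwood identified falls straight out of Kepler’s third law (orbital period squared is proportional to semi-major axis cubed): an asteroid whose period is a simple fraction of Jupiter’s gets a gravitational nudge at the same point of its orbit again and again.  A back-of-the-envelope sketch (the 2/3-power relation and Jupiter’s distance are standard values; the function name is ours, not Kirkwood’s):

```python
# Locate Kirkwood gaps: the semi-major axis at which an asteroid
# completes p orbits for every q orbits of Jupiter.  From Kepler's
# third law (T^2 proportional to a^3):  a = a_jupiter * (q/p)^(2/3).
A_JUPITER = 5.204  # Jupiter's semi-major axis, in AU

def resonance_semi_major_axis(p, q):
    """Semi-major axis (AU) of the p:q resonance with Jupiter."""
    return A_JUPITER * (q / p) ** (2 / 3)

for p, q in [(3, 1), (5, 2), (7, 3), (2, 1)]:
    print(f"{p}:{q} resonance at {resonance_semi_major_axis(p, q):.2f} AU")
```

The 3:1 resonance lands near 2.5 AU and the 2:1 near 3.3 AU — squarely where the gaps in the asteroid belt actually sit.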

Kirkwood also identified a pattern relating the distances of the planets to their rotation periods, which was called Kirkwood’s Law. This discovery earned Kirkwood an international reputation among astronomers; he was dubbed “the American Kepler” by Sears Cook Walker, who claimed that Kirkwood’s Law proved the widely held Solar Nebula Theory.  (In the event, the “Law” has since become discredited as new measurements of planetary rotation periods have shown that the pattern doesn’t hold.)

Daniel Kirkwood [source]


“The average scientist unequipped with the powerful lenses of philosophy, is a nearsighted creature, and cheerfully attacks each difficulty in the hope that it may prove to be the last”*…




There are decisive grounds for holding that we need to bring about a revolution in philosophy, a revolution in science, and then put the two together again to create a modern version of natural philosophy.

Once upon a time, it was not just that philosophy was a part of science; rather, science was a branch of philosophy. We need to remember that modern science began as natural philosophy – a development of philosophy, an admixture of philosophy and science. Today, we think of Galileo, Johannes Kepler, William Harvey, Robert Boyle, Christiaan Huygens, Robert Hooke, Edmond Halley and, of course, Isaac Newton as trailblazing scientists, while we think of Francis Bacon, René Descartes, Thomas Hobbes, John Locke, Baruch Spinoza and Gottfried Leibniz as philosophers. That division is, however, something we impose on the past. It is profoundly anachronistic…

Science broke away from metaphysics, from philosophy, as a result of natural philosophers adopting a profound misconception about the nature of science. As a result, natural philosophy died, the great divide between science and philosophy was born, and the decline of philosophy began.

It was Newton who inadvertently killed off natural philosophy with his claim, in the third edition of his Principia, to have derived his law of gravitation from the phenomena by induction…

Nicholas Maxwell argues that science and philosophy need to be re-joined, lest humanity seek knowledge at the expense of wisdom; only then, he suggests, can we hope to solve the urgent, fundamental problems that we face: “Natural philosophy redux.”

[Image above: source]

* Gilbert N. Lewis


As we seek ever-higher ground, we might recall that it was on this date in 1898 that the heirs of Alfred Nobel signed a “reconciliation agreement,” allowing his lawyers and accountants to execute his will.  The lion’s share of his estate was clearly marked for the establishment of the eponymous Prizes that are awarded each year.  But the residue, which was to be divided among his descendants, was the subject of much contention.


The first page of Nobel’s will [source]


“It’s tough to make predictions, especially about the future”*…




As astrophysicist Mario Livio recounts in Brilliant Blunders, in April 1900, the eminent physicist Lord Kelvin proclaimed that our understanding of the cosmos was complete except for two “clouds”—minor details still to be worked out. Those clouds had to do with radiation emissions and with the speed of light… and they pointed the way to two major revolutions in physics: quantum mechanics and the theory of relativity.  Prediction is hard; ironically, it’s especially hard for experts attempting foresight in their own fields…

The idea for the most important study ever conducted of expert predictions was sparked in 1984, at a meeting of a National Research Council committee on American-Soviet relations. The psychologist and political scientist Philip E. Tetlock was 30 years old, by far the most junior committee member. He listened intently as other members discussed Soviet intentions and American policies. Renowned experts delivered authoritative predictions, and Tetlock was struck by how many perfectly contradicted one another and were impervious to counterarguments.

Tetlock decided to put expert political and economic predictions to the test. With the Cold War in full swing, he collected forecasts from 284 highly educated experts who averaged more than 12 years of experience in their specialties. To ensure that the predictions were concrete, experts had to give specific probabilities of future events. Tetlock had to collect enough predictions that he could separate lucky and unlucky streaks from true skill. The project lasted 20 years, and comprised 82,361 probability estimates about the future.

The result: The experts were, by and large, horrific forecasters. Their areas of specialty, years of experience, and (for some) access to classified information made no difference. They were bad at short-term forecasting and bad at long-term forecasting. They were bad at forecasting in every domain. When experts declared that future events were impossible or nearly impossible, 15 percent of them occurred nonetheless. When they declared events to be a sure thing, more than one-quarter of them failed to transpire. As the Danish proverb warns, “It is difficult to make predictions, especially about the future.”…
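Tetlock graded those 82,361 probability estimates with (variants of) the Brier score — the mean squared error between stated probabilities and what actually happened — which is exactly the yardstick that punishes the “sure thing” pattern described above.  A minimal sketch (the numbers below are illustrative, not from the study):

```python
def brier_score(forecasts):
    """Mean squared error between predicted probabilities and outcomes
    (0 = perfect; 0.25 = always saying 50%; 1 = confidently wrong)."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# A "sure thing" (p = 1.0) that fails one time in four scores *worse*
# than an honest 75% forecast on the very same events:
events = [1, 1, 1, 0]  # one of four "certain" events fails to occur
overconfident = brier_score([(1.00, o) for o in events])
calibrated = brier_score([(0.75, o) for o in events])
print(overconfident, calibrated)
```

The overconfident forecaster scores 0.25 while the calibrated one scores 0.1875 — lower (better), despite never sounding certain.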

One subgroup of scholars, however, did manage to see more of what was coming… they were not vested in a single discipline. They took from each argument and integrated apparently contradictory worldviews…

The integrators outperformed their colleagues in pretty much every way, but especially trounced them on long-term predictions. Eventually, Tetlock bestowed nicknames (borrowed from the philosopher Isaiah Berlin) on the experts he’d observed: The highly specialized hedgehogs knew “one big thing,” while the integrator foxes knew “many little things.”…

Credentialed authorities are comically bad at predicting the future. But reliable– at least more reliable– forecasting is possible: “The Peculiar Blindness of Experts.”

See Tetlock discuss his findings at a Long Now Seminar.  Read Berlin’s riff on Archilochus, “The Hedgehog and the Fox,” here.

* Yogi Berra


As we ponder prediction, we might send complicating birthday greetings to Edward Norton Lorenz; he was born on this date in 1917.  A mathematician who turned to meteorology during World War II, he established the theoretical basis of weather and climate predictability, as well as the basis for computer-aided atmospheric physics and meteorology.

But he is probably better remembered as the founder of modern chaos theory, a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions… and thus practically impossible to predict in detail with certainty.

In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term outcome. His work on the topic culminated in the publication of his 1963 paper “Deterministic Nonperiodic Flow” in Journal of the Atmospheric Sciences, and with it, the foundation of chaos theory…
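The sensitivity Lorenz stumbled on is easy to reproduce.  A minimal sketch of his 1963 equations (a crude Euler integration with the standard parameters σ=10, ρ=28, β=8/3; the step size and starting point are illustrative choices, not Lorenz’s originals):

```python
# Lorenz's 1963 system, integrated with a crude Euler step.  Two runs
# differ only in the fourth decimal place of x0 -- the same order of
# perturbation as the rounded printout in the anecdote above.
def step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def divergence(x0a, x0b, steps=2000):
    """Largest gap in x between two trajectories over the run."""
    sa, sb = (x0a, 1.0, 1.05), (x0b, 1.0, 1.05)
    worst = 0.0
    for _ in range(steps):
        sa, sb = step(*sa), step(*sb)
        worst = max(worst, abs(sa[0] - sb[0]))
    return worst

# A perturbation of ~1e-4 in one variable; the trajectories end up far apart.
print(divergence(1.0, 1.000127))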

His description of the butterfly effect, the idea that small changes can have large consequences, followed in 1969.

Edward Lorenz [source]


“The midpoint in medicine between excessive emotional involvement with patients and a complete lack of empathy is not a simple one to locate”*…


elephant nose


In sixteenth-century Leuven, a troubled man sent for a physician to help him with his unusually long nose. The man believed that his nose was of ‘such a prodigious length’, it resembled the ‘snoute’ of an elephant. It hindered him in everything he did, to the extent that sometimes it ‘lay in the dish’ where his supper was served. His physician, at this point, artfully and carefully, ‘conveighed a long pudding’ onto the nose of the desperate man, and then with a Barber’s razor ‘finely cut away’ the offending pudding nose while his patient was drowsy from a sleeping draft. The physician prescribed him a wholesome diet and sent the man away, relieved of his extraordinarily long nose, and the burden of ‘fear of harme and inconvenience.’

This case history was described in the English translation of the medical treatise, The Touchstone of Complexions (1576) by the Dutch physician, Levinus Lemnius, as an example of ‘melancholicke’ fantasy. Instead of assuming the man was possessed by a malevolent spirit or demon (a possible diagnosis at this time), that he was a ‘lunatic’ and beyond treatment, or dismissing his delusion to his face, the sixteenth-century physician in the story entered into the world of the ‘phantasie’ to try and help his patient’s obvious distress.

We very rarely read histories of incidents from this period where physicians are concerned for the emotional and mental wellbeing of their patients to this degree. Usually the tendency has been to emphasize the ‘barbarous and debilitating’ treatments of early modern medicine – its bloodletting, purging, and surgery without anaesthetic, or to highlight the moralizing religious doctrine behind treatments of illnesses of the mind or ‘passions’. Yet, here was a doctor trying an imaginative solution to a problem he believed stemmed from an imbalance of the humour ‘melancholy’ in his patient’s body….

More examples of empathetic early healers and the bizarre cases they “cured”– the man with frogs in his stomach, the man whose buttocks were made of glass– at “The Man with an Elephant’s Nose.”

* Christine Montross, Body of Work: Meditations on Mortality from the Human Anatomy Lab


As we make it better, we might send revolutionary birthday greetings to Edward Donnall “Don” Thomas; he was born on this date in 1920.  A physician and medical researcher, Thomas shared (with Joseph E. Murray) the 1990 Nobel Prize for Physiology or Medicine for his work in transplanting bone marrow from one person to another – an achievement related to the cure of patients with acute leukemia and other blood cancers or blood diseases.  Although this prize usually goes to scientists doing basic research with test tubes, Thomas was a doctor doing hands-on clinical research with patients.

Edward Donnall “Don” Thomas [source]


Written by LW

March 15, 2019 at 12:01 am

“You’d be surprised how much it costs to look this cheap”*…


Fast Fashion


Remembering that the world has roughly 7.7 billion inhabitants…

In 2015, the fashion industry churned out 100 billion articles of clothing, doubling production from 2000, far outpacing global population growth. In that same period, we’ve stopped treating our clothes as durable, long-term purchases. The Ellen MacArthur Foundation has found that clothing utilization, or how often we wear our clothes, has dropped by 36% over the past decade and a half, and many of us wear clothes only 7 to 10 times before they end up in a landfill. Studies show that we only really wear 20% of our overflowing closets.

For the past few years, we’ve pointed the finger at fast-fashion brands like H&M, Zara, and Forever21, saying that they are responsible for this culture of overconsumption. But that’s not entirely fair. The vast majority of brands in the $1.3 billion [sic- it’s $trillion] fashion industry–whether that’s Louis Vuitton or Levi’s–measure growth in terms of increasing production every year. This means not just convincing new customers to buy products, but selling more and more to your existing customers. Right now, apparel companies put 53 million tons of clothes into the world annually. If the industry keeps up its exponential pace of growth, it is expected to reach 160 million tons by 2050….
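For scale, the quoted 53-to-160-million-ton trajectory implies a remarkably steady compound growth rate.  A quick back-of-the-envelope (the 2020–2050 window is our assumption; the tonnage figures are from the excerpt above):

```python
# Implied compound annual growth rate if apparel output grows from
# 53 to 160 million tons over an assumed 30-year window (2020-2050).
start_mt, end_mt, years = 53.0, 160.0, 30
cagr = (end_mt / start_mt) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")
```

That works out to a bit under 4% a year, every year, for three decades — tripling output in a generation.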

Churning out so many clothes has enormous environmental costs that aren’t immediately obvious to consumers. But it is becoming increasingly clear that the fashion industry is contributing to the rapid destruction of our planet. A United Nations report says that we’re on track to increase the world’s temperature by 2.7 degrees by 2040, which will flood our coastlines, intensify droughts, and lead to food shortages. Activists, world leaders, and the public at large are just beginning to reckon with the way the fashion industry is accelerating the pace of climate change…

It’s not just our closets that are suffering: “We have to fix fashion if we want to survive the climate crisis.”

The apparel industry is not, of course, unaware of all of this.  For a look at how it is responding, see Ad Age’s “How Sustainability in Fashion Went From The Margins To The Mainstream“… and draw your own conclusion as to efficacy.

[photo above: Flickr user Tofuprod]

* Dolly Parton


As we wean ourselves from whopping-great wardrobes, we might spare a thought for a man who contributed to our ability to measure our progress (or lack thereof) in addressing climate change: George James Symons; he died on this date in 1900.  A British meteorologist who was obsessed with increasing the accuracy of measurement, he devoted his career to improving meteorological records by raising measurement standards for accuracy and uniformity, and broadening the coverage (with more reporting stations, increasing their number from just 168 at the start of his career to 3,500 at the time of his death).  The Royal Meteorological Society (to which he was admitted at age 17) established a gold medal in his memory, awarded for services to meteorological science.

George James Symons (1838-1900) [source]


“It is unwise to be too sure of one’s own wisdom. It is healthy to be reminded that the strongest might weaken and the wisest might err.”*…



An 1874 engraving showing a probably apocryphal account of Newton’s lab fire. In the story, Newton’s dog started the fire, burning 20 years of research. Newton is thought to have said: “O Diamond, Diamond, thou little knowest the mischief thou hast done.”


Imagine a black box which, when you pressed a button, would generate a scientific hypothesis. 50% of its hypotheses are false; 50% are true hypotheses as game-changing and elegant as relativity. Even despite the error rate, it’s easy to see this box would quickly surpass space capsules, da Vinci paintings, and printer ink cartridges to become the most valuable object in the world. Scientific progress on demand, and all you have to do is test some stuff to see if it’s true? I don’t want to devalue experimentalists. They do great work. But it’s appropriate that Einstein is more famous than Eddington. If you took away Eddington, someone else would have tested relativity; the bottleneck is in Einsteins. Einstein-in-a-box at the cost of requiring two Eddingtons per insight is a heck of a deal.

What if the box had only a 10% success rate? A 1% success rate? My guess is: still most valuable object in the world. Even an 0.1% success rate seems pretty good, considering (what if we ask the box for cancer cures, then test them all on lab rats and volunteers?) You have to go pretty low before the box stops being great.

I thought about this after reading this list of geniuses with terrible ideas. Linus Pauling thought Vitamin C cured everything. Isaac Newton spent half his time working on weird Bible codes. Nikola Tesla pursued mad energy beams that couldn’t work. Lynn Margulis revolutionized cell biology by discovering mitochondrial endosymbiosis, but was also a 9-11 truther and doubted HIV caused AIDS. Et cetera. Obviously this should happen. Genius often involves coming up with an outrageous idea contrary to conventional wisdom and pursuing it obsessively despite naysayers. But nobody can have a 100% success rate. People who do this successfully sometimes should also fail at it sometimes, just because they’re the kind of person who attempts it at all. Not everyone fails. Einstein seems to have batted a perfect 1000 (unless you count his support for socialism). But failure shouldn’t surprise us…

Some of the people who have most contributed to our understanding of the world have been inexcusably wrong on basic issues.  But, as Scott Alexander argues, you only need one world-changing revelation to be worth reading: “Rule Thinkers In, Not Out.”

* Mahatma Gandhi


As we honor insight where we find it, we might send carefully-addressed birthday greetings to Infante Henrique of Portugal, Duke of Viseu, better known as Prince Henry the Navigator; he was born on this date in 1394.  A central figure in 15th-century Portuguese politics and in the earliest days of the Portuguese Empire, Henry encouraged Portugal’s expeditions (and colonial conquests) in Africa– and thus is regarded as the main initiator (as a product both of Portugal’s expeditions and of those that they encouraged by example) of what became known as the Age of Discoveries.




“It is not the strongest or the most intelligent who will survive but those who can best manage change”*…




In the near future, we will be in possession of genetic engineering technology which allows us to move genes precisely and massively from one species to another. Careless or commercially driven use of this technology could make the concept of species meaningless, mixing up populations and mating systems so that much of the individuality of species would be lost. Cultural evolution gave us the power to do this. To preserve our wildlife as nature evolved it, the machinery of biological evolution must be protected from the homogenizing effects of cultural evolution.

Unfortunately, the first of our two tasks, the nurture of a brotherhood of man, has been made possible only by the dominant role of cultural evolution in recent centuries. The cultural evolution that damages and endangers natural diversity is the same force that drives human brotherhood through the mutual understanding of diverse societies. Wells’s vision of human history as an accumulation of cultures, Dawkins’s vision of memes bringing us together by sharing our arts and sciences, Pääbo’s vision of our cousins in the cave sharing our language and our genes, show us how cultural evolution has made us what we are. Cultural evolution will be the main force driving our future…

An important essay by Freeman Dyson, emeritus professor of physics at the Institute for Advanced Study in Princeton: “Biological and Cultural Evolution– Six Characters in Search of an Author.”

* Leon C. Megginson (often misattributed to Darwin, on whose observations Megginson based his own)


As we agonize over the anthropocene, we might recall that it was on this date in 1968 that James Watson’s The Double Helix was published.  A memoir, it describes– with manifest hubris– “perhaps the most famous event in biology since Darwin’s book,” the 1953 discovery, published by Watson and Francis Crick, of the now-famous double helix structure of the DNA molecule.

Crick, however, viewed Watson’s book as “far too much gossip,” and believed it gave short shrift to Rosalind Franklin’s vital contribution via clues from her X-ray crystallography results.  It was originally slated to be published by Harvard University Press, Watson’s home university, but Harvard dropped the arrangement after protests from Francis Crick and Maurice Wilkins (who shared the Nobel Prize with Watson and Crick); it was published instead by Atheneum in the United States and Weidenfeld & Nicolson in the UK.

In any case, the book opens a window into the competitiveness, struggles, doubts, and human foibles that were baked into this landmark in science.

The Double Helix [source]

