(Roughly) Daily

Posts Tagged ‘reason’

“I used to measure the skies, now I measure the shadows of Earth”*…

From ancient Egyptian cubits to fitness-tracker apps, humankind has long sought ever more ways to measure the world – and ourselves…

The discipline of measurement has developed over millennia… Around 6,000 years ago, the first standardised units were deployed in river valley civilisations such as ancient Egypt, where the cubit was defined by the length of the human arm, from elbow to the tip of the middle finger, and used to measure out the dimensions of the pyramids. In the Middle Ages, the task of regulating measurement to facilitate trade was both privilege and burden for rulers: a means of exercising power over their subjects, but a trigger for unrest if neglected. As the centuries passed, units multiplied, and in 18th-century France there were said to be some 250,000 variant units in use, leading to the revolutionary demand: “One king, one law, one weight and one measure.”

It was this abundance of measures that led to the creation of the metric system by French savants. A unit like the metre – defined originally as one ten-millionth of the distance from the equator to the north pole – was intended not only to simplify metrology, but also to embody political ideals. Its value and authority were derived not from royal bodies, but scientific calculation, and were thus, supposedly, equal and accessible to all. Then as today, units of measurement are designed to create uniformity across time, space and culture; to enable control at a distance and ensure trust between strangers. What has changed since the time of the pyramids is that now they often span the whole globe.
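That original definition invites a quick sanity check. A minimal sketch (the quadrant length below is the modern estimate of the meridian arc, not a figure from the article):

```python
# The metre was originally defined as one ten-millionth of the meridian
# arc from the equator to the North Pole ("Earth's quadrant").
quadrant_km = 10_001.966                    # modern estimate of that arc
metre = quadrant_km * 1_000 / 10_000_000    # one ten-millionth, in metres

print(f"one original metre ~= {metre:.4f} modern metres")
# ~1.0002 – the 18th-century surveyors came remarkably close.
```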

Despite their abundance, the standards maintained by bodies like NIST and the International Organization for Standardization (ISO) are mostly invisible in our lives. Where measurement does intrude is via bureaucracies of various stripes, particularly in education and the workplace. It’s in school that we are first exposed to the harsh lessons of quantification – where we are sorted by grade and rank and number, and told that these are the measures by which our future success will be gauged…

A fascinating survey of the history of measurement, and a consideration of its consequences: “Made to measure: why we can’t stop quantifying our lives,” from James Vincent (@jjvincent) in @guardian, an excerpt from his new book Beyond Measure: The Hidden History of Measurement.

And for a look at what it takes to perfect one of the most fundamental of those measures, see Jeremy Bernstein’s “The Kilogram.”

* “I used to measure the skies, now I measure the shadows of Earth. Although my mind was sky-bound, the shadow of my body lies here.” – the epitaph Johannes Kepler composed for himself a few months before he died

###

As we get out the gauge, we might send thoughtfully-wagered birthday greetings to Blaise Pascal; he was born on this date in 1623. A French mathematician, physicist, theologian, and inventor (of, e.g., the Pascaline – one of the first mechanical calculators – the hydraulic press, and the syringe; he also put Torricelli’s barometer to famous experimental use), his commitment to empiricism (“experiments are the true teachers which one must follow in physics”) pitted him against his contemporary René “cogito, ergo sum” Descartes – and was foundational in the acceleration of the scientific/rationalist commitment to measurement…


Happy Juneteenth!

“Nothing in life is certain except death, taxes and the second law of thermodynamics”*…

The second law of thermodynamics – asserting that the entropy of an isolated system never decreases – is among the most sacred laws in all of science, but it has always rested on 19th-century arguments about probability. As Philip Ball reports, new thinking traces its true source to the flows of quantum information…

In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).

Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information…
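The claim that entropy can flow from entanglement is easy to make concrete with a toy calculation (my illustration, not the researchers’ formalism): a two-qubit Bell state is globally pure – zero entropy – yet either qubit on its own looks maximally disordered, because the information lives in the correlations between the parts.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): a pure global state with zero entropy.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(phi, phi.conj())              # 4x4 global density matrix

def von_neumann_entropy(r):
    """S(rho) = -Tr(rho log2 rho), skipping zero eigenvalues."""
    evals = np.linalg.eigvalsh(r)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

# Trace out qubit B: rho_A[i, j] = sum_k rho[(i,k), (j,k)]
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(von_neumann_entropy(rho))    # 0.0 – the whole is perfectly ordered
print(von_neumann_entropy(rho_A))  # 1.0 – each part alone is maximally mixed
```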

Is that most sacrosanct of natural laws, the second law of thermodynamics, a quantum phenomenon? “Physicists Rewrite the Fundamental Law That Leads to Disorder,” from @philipcball in @QuantaMagazine.

* “Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don’t have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.” — Seth Lloyd

###

As we get down with disorder, we might spare a thought for François-Marie Arouet, better known as Voltaire; he died on this date in 1778. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works – more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica, to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.


“No law of nature, however general, has been established all at once; its recognition has always been preceded by many presentiments.”*…

Laws of nature are impossible to break, and nearly as difficult to define. Just what kind of necessity do they possess?

… The natural laws limit what can happen. They are stronger than the laws of any country because it is impossible to violate them. If it is a law of nature that, for example, no object can be accelerated from rest to beyond the speed of light, then it is not merely that such accelerations never occur. They cannot occur.

There are many things that never actually happen but could have happened in that their occurrence would violate no law of nature. For instance, to borrow an example from the philosopher Hans Reichenbach (1891-1953), perhaps in the entire history of the Universe there never was nor ever will be a gold cube larger than one mile on each side. Such a large gold cube is not impossible. It just turns out never to exist. It’s like a sequence of moves that is permitted by the rules of chess but never takes place in the entire history of chess-playing. By contrast, if it is a law of nature that energy is never created or destroyed, then it is impossible for the total energy in the Universe to change. The laws of nature govern the world just as the rules of chess determine what is permitted and what is forbidden during a game of chess, in an analogy drawn by the biologist T H Huxley (1825-95).

Laws of nature differ from one another in many respects. Some laws concern the general structure of spacetime, while others concern some specific inhabitant of spacetime (such as the law that gold doesn’t rust). Some laws relate causes to their effects (as Coulomb’s law relates electric charges to the electric forces they cause). But other laws (such as the law of energy conservation or the spacetime symmetry principles) do not specify the effects of any particular sort of cause. Some laws involve probabilities (such as the law specifying the half-life of some radioactive isotope). And some laws are currently undiscovered – though I can’t give you an example of one of those! (By ‘laws of nature’, I will mean the genuine laws of nature that science aims to discover, not whatever scientists currently believe to be laws of nature.)

What all of the various laws have in common, despite their diversity, is that it is necessary that everything obey them. It is impossible for them to be broken. An object must obey the laws of nature…

But although all these truisms about the laws of nature sound plausible and familiar, they are also imprecise and metaphorical. The natural laws obviously do not ‘govern’ the Universe in the way that the rules of chess govern a game of chess. Chess players know the rules and so deliberately conform to them, whereas inanimate objects do not know the laws of nature and have no intentions.

Scientists discover laws of nature by acquiring evidence that some apparent regularity is not only never violated but also could never have been violated. For instance, when every ingenious effort to create a perpetual-motion machine turned out to fail, scientists concluded that such a machine was impossible – that energy conservation is a natural law, a rule of nature’s game rather than an accident. In drawing this conclusion, scientists adopted various counterfactual conditionals, such as that, even if they had tried a different scheme, they would have failed to create a perpetual-motion machine. That it is impossible to create such a machine (because energy conservation is a law of nature) explains why scientists failed every time they tried to create one.

Laws of nature are important scientific discoveries. Their counterfactual resilience enables them to tell us about what would have happened under a wide range of hypothetical circumstances. Their necessity means that they impose limits on what is possible. Laws of nature can explain why something failed to happen by revealing that it cannot happen – that it is impossible.

We began with several vague ideas that seem implicit in scientific reasoning: that the laws of nature are important to discover, that they help us to explain why things happen, and that they are impossible to break. Now we can look back and see that we have made these vague ideas more precise and rigorous. In doing so, we found that these ideas are not only vindicated, but also deeply interconnected. We now understand better what laws of nature are and why they are able to play the roles that science calls upon them to play.

“What is a Law of Nature?,” Marc Lange explains in @aeonmag.

* Dmitri Mendeleev (creator of the Periodic Table)

###

As we study law, we might send inquisitive birthday greetings to Federico Cesi; he was born on this date in 1585. A scientist and naturalist, he is best remembered as the founder of the Accademia dei Lincei (Lincean Academy), often cited as the first modern scientific society. Cesi coined (or at least was first to publish/disseminate) the word “telescope” to denote the instrument used by Galileo – who was the sixth member of the Lincean Academy.


“My work consists of two parts; that presented here plus all I have not written. It is this second part that is important.”*…

Ludwig Wittgenstein’s wooden cabin in Skjolden, Norway

On the occasion of its centenary, Peter Salmon considers the history, context, and lasting significance of Wittgenstein’s revolutionary first work…

One hundred years ago, a slim volume of philosophy was published by the then unknown Austrian philosopher Ludwig Wittgenstein. The book was as curious as its title, Tractatus Logico-Philosophicus. Running to only 75 pages, it was in the form of a series of propositions, the gnomic quality of the first as baffling to the newcomer today as it was then.

1. The world is all that is the case.
1.1 The world is the totality of facts, not of things.
1.11 The world is determined by the facts, and by their being all the facts.
1.12 For the totality of facts determines what is the case, and also whatever is not the case.
1.13 The facts in logical space are the world.

And so on, through six propositions, 526 numbered statements, equally emphatic and enigmatic, until the seventh and final proposition, which stands alone at the end of the text: “Whereof we cannot speak, thereof we must remain silent.”

The book’s influence was to be dramatic and far-reaching. Wittgenstein believed he had found a “solution” to how language and the world relate, that they shared a logical form. This also set a limit as to what questions could be meaningfully asked. Any question which could not be verified was, in philosophical terms, nonsense.

Written in the First World War trenches, Tractatus is, in many ways, a work of mysticism…

Ludwig Wittgenstein’s Tractatus is as brilliant and baffling today as it was on its publication a century ago: “The logical mystic,” from @petesalmon in @NewHumanist.

* Ludwig Wittgenstein

###

As we wrestle with reason and reality, we might recall that it was on this date in 1930 that Dashiell Hammett’s The Maltese Falcon – likely a favorite of Wittgenstein’s – was published. In 1990 the novel ranked 10th on the Crime Writers’ Association’s list of the Top 100 Crime Novels of All Time. Five years later, in a similar list by the Mystery Writers of America, it ranked third.


“Your job as a scientist is to figure out how you’re fooling yourself”*…


And as with scientists, so with all of us…

Science has shown that we tend to make all sorts of mental mistakes, called “cognitive biases”, that can affect both our thinking and actions. These biases can lead to us extrapolating information from the wrong sources, seeking to confirm existing beliefs, or failing to remember events the way they actually happened!

To be sure, this is all part of being human—but such cognitive biases can also have a profound effect on our endeavors, investments, and life in general.

Humans have a tendency to think in particular ways that can lead to systematic deviations from making rational judgments.

These tendencies usually arise from:

• Information processing shortcuts

• The limited processing ability of the brain

• Emotional and moral motivations

• Distortions in storing and retrieving memories

• Social influence

Cognitive biases have been studied for decades by academics in the fields of cognitive science, social psychology, and behavioral economics, but they are especially relevant in today’s information-packed world. They influence the way we think and act, and such irrational mental shortcuts can lead to all kinds of problems in entrepreneurship, investing, or management.

Here are five examples of how these types of biases can affect people in the business world:

1. Familiarity Bias: An investor puts her money in “what she knows”, rather than seeking the obvious benefits from portfolio diversification. Just because a certain type of industry or security is familiar doesn’t make it the logical selection.

2. Self-Attribution Bias: An entrepreneur overly attributes his company’s success to himself, rather than other factors (team, luck, industry trends). When things go bad, he blames these external factors for derailing his progress.

3. Anchoring Bias: An employee in a salary negotiation is too dependent on the first number mentioned in the negotiations, rather than rationally examining a range of options.

4. Survivorship Bias: Entrepreneurship looks easy, because there are so many successful entrepreneurs out there. However, this is a cognitive bias: the successful entrepreneurs are the ones still around, while the millions who failed went and did other things (a quick simulation after this list makes the distortion concrete).

5. Gambler’s Fallacy: A venture capitalist sees a portfolio company rise and rise in value after its IPO, far beyond what he initially thought possible. Instead of holding on to a winner and rationally evaluating the possibility that appreciation could still continue, he dumps the stock to lock in the existing gains.
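Survivorship bias (item 4) is easy to quantify. A hypothetical sketch – the 90% failure rate and the 1x–20x payoff range are illustrative assumptions, not data:

```python
import random

random.seed(42)
N = 100_000   # hypothetical ventures

# Toy model: 90% of ventures fail (payoff 0x); the rest return 1x-20x.
payoffs = [random.uniform(1.0, 20.0) if random.random() < 0.1 else 0.0
           for _ in range(N)]
survivors = [p for p in payoffs if p > 0]

print(f"mean payoff, everyone who started: {sum(payoffs) / len(payoffs):.2f}x")
print(f"mean payoff, survivors only:       {sum(survivors) / len(survivors):.2f}x")
# ~1.05x vs ~10.5x: judging only the visible winners makes the odds look
# roughly ten times better than they actually are – survivorship bias.
```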

An aid to thinking about thinking: “Every Single Cognitive Bias in One Infographic.” From DesignHacks.co via Visual Capitalist.

And for a fascinating look at cognitive bias’s equally dangerous cousin, innumeracy, see here.

* Saul Perlmutter, astrophysicist, Nobel laureate

###

As we cogitate, we might recall that it was on this date in 1859 that “The Carrington Event” began. Lasting two days, it was the largest solar storm on record: an intense solar flare and an associated coronal mass ejection (CME) that disrupted many of the (relatively few) telegraph systems then operating on Earth.

A solar storm of this magnitude occurring today would cause widespread electrical disruptions, blackouts, and damage due to extended outages of the electrical grid. The solar storm of 2012 was of similar magnitude, but it passed Earth’s orbit without striking the planet, missing by nine days. See here for more detail on what such a storm might entail.

Sunspots of 1 September 1859, as sketched by R.C. Carrington. A and B mark the initial positions of an intensely bright event, which moved over the course of five minutes to C and D before disappearing.