(Roughly) Daily

Posts Tagged ‘thinking’

“I will buckle down to work as soon as I finish reading the Internet”*…

From Aldobrandino da Siena’s Le Régime du corps (1265-70 CE)

Worried that technology is “breaking your brain”? As Joe Stadolnik explains, fears about attention spans and focus are as old as writing itself…

If you suspect that 21st-century technology has broken your brain, it will be reassuring to know that attention spans have never been what they used to be. Even the ancient Roman philosopher Seneca the Younger was worried about new technologies degrading his ability to focus. Sometime during the 1st century CE, he complained that ‘The multitude of books is a distraction’. This concern reappeared again and again over the following centuries. By the 12th century, the Chinese philosopher Zhu Xi saw himself living in a new age of distraction thanks to the technology of print: ‘The reason people today read sloppily is that there are a great many printed texts.’ And in 14th-century Italy, the scholar and poet Petrarch made even stronger claims about the effects of accumulating books:

Believe me, this is not nourishing the mind with literature, but killing and burying it with the weight of things or, perhaps, tormenting it until, frenzied by so many matters, this mind can no longer taste anything, but stares longingly at everything, like Tantalus thirsting in the midst of water.

Technological advances would make things only worse. A torrent of printed texts inspired the Renaissance scholar Erasmus to complain of feeling mobbed by ‘swarms of new books’, while the French theologian Jean Calvin wrote of readers wandering into a ‘confused forest’ of print. That easy and constant redirection from one book to another was feared to be fundamentally changing how the mind worked. Apparently, the modern mind – whether metaphorically undernourished, harassed or disoriented – has been in no position to do any serious thinking for a long time.

In the 21st century, digital technologies are inflaming the same old anxieties… and inspiring some new metaphors…

Same as it ever was– a history of the anxieties about attention and memory occasioned by new communications technologies: “We’ve always been distracted,” from @joestadolnik in @aeonmag.

* Stewart Brand @stewartbrand


As we learn our way into new media, we might recall that it was on this date in 1946 that the first Washington, D.C. – New York City telecast was accomplished, using AT&T’s coaxial cable; General Dwight Eisenhower was seen placing a wreath at the base of the statue in the Lincoln Memorial, and others made brief speeches. The event was judged a success by engineers, although Time magazine called it “as blurred as an early Chaplin movie.”

1946 television (source)

Written by (Roughly) Daily

February 18, 2023 at 1:00 am

“The brain is a wonderful organ; it starts working the moment you get up in the morning and does not stop until you get into the office”*…

For as long as humans have thought, humans have thought about thinking. George Cave on the power and the limits of the metaphors we’ve used to do that…

For thousands of years, humans have described their understanding of intelligence with engineering metaphors. In the 3rd century BCE, the invention of hydraulics popularized the model of fluid flow (“humours”) in the body. This lasted until the 1500s, supplanted by the invention of automata and the idea of humans as complex machines. From electrical and chemical metaphors in the 1700s to advances in communications a century later, each metaphor reflected the most advanced thinking of that era. Today is no different: we talk of brains that store, process and retrieve memories, mirroring the language of computers.

I’ve always believed metaphors to be helpful and productive in communicating unfamiliar concepts. But this fascinating history of cognitive science metaphors shows that flawed metaphors can take hold and limit the scope for alternative ideas. In the worst case, the EU spent 10 years and $1.3 billion building a model of the brain based on the incorrect belief that the brain functions like a computer…

Thinking about thinking, from @George_Cave in @the_prepared.

Apposite: “Finding Language in the Brain.”

* Robert Frost


As we cogitate on cognition, we might send carefully computed birthday greetings to Grace Brewster Murray Hopper.  A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.

Hopper also (inadvertently) contributed one of the most ubiquitous metaphors in computer science: she found and documented the first computer “bug” (in 1947).

She has both a ship (the guided-missile destroyer USS Hopper) and a super-computer (the Cray XE6 “Hopper” at NERSC) named in her honor.


Written by (Roughly) Daily

December 9, 2022 at 1:00 am

“To achieve style, begin by affecting none”*…

The first issue of the Philosophical Transactions of the Royal Society

From Roger’s Bacon, in New Science, a brief history of scientific writing…

Since the founding of the first scientific journal in 1665, there have been calls to do away with stylistic elements in favor of clarity, concision, and precision.

In 1667, Thomas Sprat urged members of the Royal Society to “reject all the amplifications, digressions, and swellings of style; to return back to the primitive purity, and shortness, when men delivered so many things, almost in an equal number of words.” Some 200 years later, Charles Darwin said much the same: “I think too much pains cannot be taken in making the style transparently clear and throwing eloquence to the dogs” (Aaronson, 1977).

Darwin and Sprat eventually got their way. Modern scientific writing is homogeneous, cookie-cutter, devoid of style. But scientific papers weren’t always like this.

Writing in The Last Word On Nothing blog, science journalist Roberta Kwok explains how old articles differ from their modern counterparts:

Scientists used to admit when they didn’t know what the hell was going on.

When philosopher Pierre Gassendi tried to capture observations of Mercury passing in front of the Sun in 1631, he was beset by doubts:

“[T]hrown into confusion, I began to think that an ordinary spot would hardly pass over that full distance in an entire day. And I was undecided indeed… I wondered if perhaps I could not have been wrong in some way about the distance measured earlier.”

They get excited and use italics.

In 1892, a gentleman named William Brewster observed a bird called a northern shrike attacking a meadow mouse in Massachusetts. After tussling with its prey, he wrote, “[t]he Shrike now looked up and seeing me jumped on the mouse with both feet and flew off bearing it in its claws.”

They write charming descriptions.

Here’s French scientist Jean-Henri Fabre rhapsodizing about the emperor moth in his book, The Life of the Caterpillar (1916):

Who does not know the magnificent Moth, the largest in Europe, clad in maroon velvet with a necktie of white fur? The wings, with their sprinkling of grey and brown, crossed by a faint zig-zag and edged with smoky white, have in the centre a round patch, a great eye with a black pupil and a variegated iris containing successive black, white, chestnut and purple arcs.

All this to say: Scientists in the pre-modern era wrote freely, despite calls to do away with that freedom. At some point, narrative and literary styles vanished and were replaced with rigid formats and impoverished prose.  The question now is: Have we gone too far in removing artistry from scientific writing?

For a well-argued case that we have– “the way that we write is inseparable from the way that we think, and restrictions in one necessarily lead to restrictions in the other”– read on: “Research Papers Used to Have Style. What Happened?,” from @RogersBacon1 and @newscienceorg.

* E. B. White, The Elements of Style


As we ponder purposive prose, we might spare a thought for Johann Adam Schall von Bell; he died on this date in 1666. An expressive writer in both German and Chinese, he was an astronomer and Jesuit missionary to China who revised the Chinese calendar, translated Western astronomical books, and was head of the Imperial Board of Astronomy (1644-64). Given the Chinese name “Tang Ruowang,” he became a trusted adviser (1644-61) to Emperor Shun-chih, first emperor of the Ch’ing dynasty (1644-1911/12), who made him a mandarin.


“Your job as a scientist is to figure out how you’re fooling yourself”*…

Larger version here

And like scientists, so all of us…

Science has shown that we tend to make all sorts of mental mistakes, called “cognitive biases”, that can affect both our thinking and actions. These biases can lead us to extrapolate information from the wrong sources, seek to confirm existing beliefs, or fail to remember events the way they actually happened!

To be sure, this is all part of being human—but such cognitive biases can also have a profound effect on our endeavors, investments, and life in general.

Humans have a tendency to think in particular ways that can lead to systematic deviations from making rational judgments.

These tendencies usually arise from:

• Information processing shortcuts

• The limited processing ability of the brain

• Emotional and moral motivations

• Distortions in storing and retrieving memories

• Social influence

Cognitive biases have been studied for decades by academics in the fields of cognitive science, social psychology, and behavioral economics, but they are especially relevant in today’s information-packed world. They influence the way we think and act, and such irrational mental shortcuts can lead to all kinds of problems in entrepreneurship, investing, or management.

Here are five examples of how these types of biases can affect people in the business world:

1. Familiarity Bias: An investor puts her money in “what she knows”, rather than seeking the obvious benefits from portfolio diversification. Just because a certain type of industry or security is familiar doesn’t make it the logical selection.

2. Self-Attribution Bias: An entrepreneur overly attributes his company’s success to himself, rather than other factors (team, luck, industry trends). When things go bad, he blames these external factors for derailing his progress.

3. Anchoring Bias: An employee in a salary negotiation is too dependent on the first number mentioned in the negotiations, rather than rationally examining a range of options.

4. Survivorship Bias: Entrepreneurship looks easy, because there are so many successful entrepreneurs out there. However, this is a cognitive bias: the successful entrepreneurs are the ones still around, while the millions who failed went and did other things.

5. Gambler’s Fallacy: A venture capitalist sees a portfolio company rise and rise in value after its IPO, far beyond what he initially thought possible. Instead of holding on to a winner and rationally evaluating the possibility that appreciation could still continue, he dumps the stock to lock in the existing gains.
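The gambler’s fallacy in example 5 is easy to check numerically: in a sequence of independent events, a past streak carries no information about the next outcome. Here is a minimal simulation (a fair coin is assumed; the script and its numbers are illustrative, not drawn from the infographic):

```python
import random

random.seed(42)

# Simulate a long run of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(200_000)]

# Collect the outcome that follows every streak of five heads.
after_streak = []
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):        # the previous five flips were all heads
        after_streak.append(flips[i])

p = sum(after_streak) / len(after_streak)
print(round(p, 2))  # ≈ 0.5: the streak tells us nothing about the next flip
```

However long the streak, the estimated probability stays near one half; the feeling that a reversal is “due” is the bias, not the coin.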

An aid to thinking about thinking: “Every Single Cognitive Bias in One Infographic.” From DesignHacks.co via Visual Capitalist.

And for a fascinating look at cognitive bias’ equally dangerous cousin, innumeracy, see here.

* Saul Perlmutter, astrophysicist, Nobel laureate


As we cogitate, we might recall that it was on this date in 1859 that “The Carrington Event” began. Lasting two days, it was the largest solar storm on record: a massive solar flare, accompanied by a coronal mass ejection (CME), that disrupted many of the (relatively few) telegraph lines and electrical systems then on Earth.

A solar storm of this magnitude occurring today would cause widespread electrical disruptions, blackouts, and damage due to extended outages of the electrical grid. The solar storm of 2012 was of similar magnitude, but it passed Earth’s orbit without striking the planet, missing by nine days. See here for more detail on what such a storm might entail.

Sunspots of 1 September 1859, as sketched by R.C. Carrington. A and B mark the initial positions of an intensely bright event, which moved over the course of five minutes to C and D before disappearing.

I’m relatively sure that this is reassuring news…


image: Paul Wesley Griggs

The quantum world defies the rules of ordinary logic. Particles routinely occupy two or more places at the same time and don’t even have well-defined properties until they are measured. It’s all strange, yet true – quantum theory is the most accurate scientific theory ever tested and its mathematics is perfectly suited to the weirdness of the atomic world. (…)

Human thinking, as many of us know, often fails to respect the principles of classical logic. We make systematic errors when reasoning with probabilities, for example. Physicist Diederik Aerts of the Free University of Brussels, Belgium, has shown that these errors actually make sense within a wider logic based on quantum mathematics. The same logic also seems to fit naturally with how people link concepts together, often on the basis of loose associations and blurred boundaries. That means search algorithms based on quantum logic could uncover meanings in masses of text more efficiently than classical algorithms.

It may sound preposterous to imagine that the mathematics of quantum theory has something to say about the nature of human thinking. This is not to say there is anything quantum going on in the brain, only that “quantum” mathematics really isn’t owned by physics at all, and turns out to be better than classical mathematics in capturing the fuzzy and flexible ways that humans use ideas. “People often follow a different way of thinking than the one dictated by classical logic,” says Aerts. “The mathematics of quantum theory turns out to describe this quite well.” (…)
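The kind of “quantum” mathematics Aerts describes can be sketched without any physics. In a toy two-dimensional model (entirely illustrative; the vectors and angles below are invented for the example), a belief state is a unit vector and each yes/no question is a projection onto a direction. Because projections onto different directions don’t commute, the probability of answering “yes” to question A and then question B differs from the reverse order, mimicking the question-order effects observed in surveys:

```python
import math

def proj(v, u):
    # Project vector v onto the line spanned by unit vector u.
    d = v[0] * u[0] + v[1] * u[1]
    return (d * u[0], d * u[1])

def norm_sq(v):
    # Squared length of v, read as a probability in this model.
    return v[0] ** 2 + v[1] ** 2

# Initial belief state: a unit vector in a 2-D "concept space".
theta = math.radians(20)
psi = (math.cos(theta), math.sin(theta))

# Two yes/no questions, each modelled as projection onto a "yes" direction.
a = (1.0, 0.0)
phi = math.radians(45)
b = (math.cos(phi), math.sin(phi))

# Probability of "yes" to A, then "yes" to B...
p_ab = norm_sq(proj(proj(psi, a), b))
# ...and in the reverse order.
p_ba = norm_sq(proj(proj(psi, b), a))

print(p_ab, p_ba)  # the two orders give different probabilities
```

Classical probability has no room for this order dependence; in the projection model it falls out automatically, which is the sense in which the quantum formalism “describes this quite well.”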

Why should quantum logic fit human behaviour? Peter Bruza at Queensland University of Technology in Brisbane, Australia, suggests the reason is to do with our finite brain being overwhelmed by the complexity of the environment yet having to take action long before it can calculate its way to the certainty demanded by classical logic. Quantum logic may be more suitable to making decisions that work well enough, even if they’re not logically faultless. “The constraints we face are often the natural enemy of getting completely accurate and justified answers,” says Bruza.

This idea fits with the views of some psychologists, who argue that strict classical logic only plays a small part in the human mind. Cognitive psychologist Peter Gardenfors of Lund University in Sweden, for example, argues that much of our thinking operates on a largely unconscious level, where thought follows a less restrictive logic and forms loose associations between concepts.

Aerts agrees. “It seems that we’re really on to something deep we don’t yet fully understand.” This is not to say that the human brain or consciousness have anything to do with quantum physics, only that the mathematical language of quantum theory happens to match the description of human decision-making.

Perhaps only humans, with our seemingly illogical minds, are uniquely capable of discovering and understanding quantum theory.

Read the article in its fascinating entirety at New Scientist (via Amira Skowmorowska’s Lapidarium Notes).


As we feel even more justified in agreeing with Emerson that “a foolish consistency is the hobgoblin of little minds,” we might recall that it was on this date in 1692 that Giles Corey, a prosperous farmer and full member of the church in early colonial America, died under judicial torture during the Salem witch trials.  Corey refused to enter a plea; he was crushed to death by stone weights in an attempt to force him to do so.

Under the law at the time, a person who refused to plead could not be tried. To avoid persons cheating justice, the legal remedy was “peine forte et dure”– a process in which the prisoner is stripped naked and placed prone under a heavy board. Rocks or boulders are then laid on the wood. Corey was the only person in New England to suffer this punishment, though Margaret Clitherow was similarly crushed in England in 1586 for refusing to plead to the charge of secretly practicing Catholicism.

Corey’s last words were “more weight.”



