(Roughly) Daily

Posts Tagged ‘humility’

“I set records that will never be equaled, 90% I hope are never printed”*…

The inimitable Ray Ratto in the always-illuminating Defector on the legend that was Bob Uecker, who died, ten days shy of his 91st birthday, on January 16…

It is not difficult to find people who loved Bob Uecker unreservedly. There is a good reason for this: He was the last genuinely and instinctively funny person in baseball history. He was the god of Milwaukee even if you include all the Green Bay Packers going back to before Vince Lombardi. He was the face and voice of baseball cinema, the man whose line-reading made “Ju-u-u-u-st a bit outside” so good that “iconic” doesn’t remotely cover its impact. Even if you’re not a seamhead, you likely came across Bob Uecker and smiled.

So Thursday’s announcement that Uecker had died at age 90, due to small cell lung cancer, came as a blow. Nine decades is a good long run, but there was never a sense that he was running out of material; Uecker was still a joy to hear on Brewers broadcasts even in Year 54 as the voice of Wisconsin baseball, a run spanning two-and-a-half generations. The reaction to his passing was unanimous in the same ways and for the same reasons that the response to Vin Scully’s death was unanimous—it was an outpouring of both sadness at the loss and gratitude for all the time we got to spend with him. In an epochally angry time in America, at a moment when it isn’t hard to find even anti-puppy polemics with a keystroke, Uecker gets a pass from most everyone. Yes, he defined baseball, but he also managed to become more than merely Mister Baseball. From the moment of his first appearance on Johnny Carson’s definitive version of the Tonight Show, which Uecker earned merely by mastering the tripartite arts of comedy writing, unabashed self-deprecation, and martini-dry humor, he was recognizable as that rarest of Americans, the guy you’d sit back down to listen to even if you were already halfway out the door. Put another way, Norm Macdonald thought he was one of the funniest men he ever met. Beat that with a stick.

He did WrestleManias. He starred in a not-entirely-forgettable sitcom, Mr. Belvedere. He did Major League and Homeward Bound; he did Puppy Dog Pals and Futurama. He did beer commercials that didn’t make you want to hurl bricks through your appliances. He did a Hall of Fame speech that exceeds all other such orations by a significant margin because he’d had years to perfect it, even though his playing career was its direct antithesis. Uecker was skilled enough to remain in the major leagues for six years, and smart enough to parlay a lifetime batting average of .19973—not .200, which is what his Baseball Reference page reads—into a career. One-ninety-nine-and-change. He managed to make that his calling card until he could replace it with his far superior ability to help others enjoy their day.
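A quick aside on the arithmetic behind that punchline, in a minimal sketch; the career totals (146 hits in 731 at-bats) are taken as an assumption from the Baseball-Reference page the excerpt cites, and the rounding behavior is the point:

```python
# Uecker's career totals (assumed, per Baseball-Reference): 146 hits, 731 at-bats.
hits, at_bats = 146, 731

average = hits / at_bats
print(f"{average:.5f}")   # 0.19973 -- the number Uecker dined out on
print(f"{average:.3f}")   # 0.200  -- the standard three-digit display
```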

Like Scully, Uecker passed through life with almost no detractors, and not just because his methodology was to beat everyone to the detracting. He gave off an aura of knowing who he was and was not. Just as important, he knew where he was and was not. Uecker was more than content with being the definition of Milwaukee, staying in the Brewers’ radio booth for more than half a century, all while doing the movies and ads and national broadcasts that somehow made him even more Milwaukee-centric. It was an act of rooting to the ground that is almost unheard of today. Uecker didn’t put a shine on the Brewers when they didn’t deserve one, but he didn’t kick them when they were down, either. That’s because he wasn’t parlaying the Brewers gig into some better team in a bigger city. Mostly, he was revered for never leaving town even when Los Angeles might have been more logistically favorable. Who knew that being grounded was the best way to fly?

There will be flurries of other tributes in the next few days, from all corners; Uecker covered a great deal of ground despite starting his public life in a perpetual squat. Unlike nearly every other athlete of his era, Uecker was actually far better and more popular at everything that wasn’t athletics-based. He got 90 years out of this simple-to-explain-and-hard-to-deliver formula, and he succeeded less by luck or planning and more by simply being what he was—the guy who made everyone happier by the simple act of entering their space…

“Bob Uecker Was Just the Best” (gift article) from @rayratto.bsky.social in @defector.bsky.social.

* Bob Uecker, in his Hall of Fame speech

###

As we honor the authentic, we might spare a thought for Henry Louis “Hank” Aaron; he died on this date in 2021. Considered one of the greatest Major League Baseball players in history, he spent 21 seasons with the Milwaukee/Atlanta Braves in the National League (NL) and two seasons with the Milwaukee Brewers in the American League (AL). At the time of his retirement, Aaron held most of the game’s key career power-hitting records. He broke Babe Ruth’s long-standing MLB record for career home runs and remained the career leader for 33 years, until Barry Bonds surpassed his famous total of 755 in 2007. He hit 24 or more home runs every year from 1955 through 1973 and is one of only two players to hit 30 or more home runs in a season at least fifteen times.

Aaron and Uecker were teammates in Milwaukee…

Written by (Roughly) Daily

January 22, 2025 at 1:00 am

“I was conscious that I knew practically nothing”*…

The estimable Nicholas Carr observes that “you don’t make friends by telling people they’re not as smart as they think they are. And you definitely don’t make friends by telling all of humanity that it’s not as smart as it thinks it is. That’s why the philosophical school of Mysterianism has never caught on with the public.” As an amateur Mysterian himself, he reprises a 2017 essay to spread the good word…

By leaps, steps, and stumbles, science progresses. Its seemingly inexorable advance promotes a sense that everything can be known and will be known. Through observation and experiment, and lots of hard thinking, we will come to explain even the murkiest and most complicated of nature’s secrets: consciousness, dark matter, time, the origin and fate of the universe.

But what if our faith in nature’s knowability is just an illusion, a trick of the overconfident human mind? That’s the working assumption behind a school of thought known as Mysterianism. Situated at the fruitful if sometimes fraught intersection of scientific and philosophic inquiry, the Mysterianist view has been promulgated, in different ways, by many prominent thinkers, from the philosopher Colin McGinn to the linguist Noam Chomsky to the cognitive scientist Steven Pinker. The Mysterians propose that human intellect has boundaries and that many of the mysteries of the cosmos will forever lie beyond our comprehension.

Mysterianism is most closely associated with the so-called hard problem of consciousness: How can the inanimate matter of the brain produce subjective feelings? The Mysterians suggest that the human mind is incapable of understanding itself, that we will never know how consciousness works. But if Mysterianism applies to the workings of the mind, there’s no reason it shouldn’t also apply to the workings of nature in general. As McGinn has suggested, “It may be that nothing in nature is fully intelligible to us.”

The simplest and best argument for Mysterianism is founded on evolutionary evidence. When we examine any other living creature, we understand immediately that its intellect is limited. Even the brightest, most curious dog is not going to master arithmetic. Even the wisest of owls knows nothing of the physiology of the field mouse it devours. If all the minds that evolution has produced have bounded comprehension, then it’s only logical that our own minds, also products of evolution, would have limits as well. As Pinker has put it, “The brain is a product of evolution, and just as animal brains have their limitations, we have ours.” To assume that there are no limits to human understanding is to believe in a level of human exceptionalism that seems miraculous, if not mystical.

Mysterianism, it’s important to emphasize, is not inconsistent with materialism [or with theism or idealism]. The Mysterians don’t suggest that what’s unknowable has to be spiritual or otherwise otherworldly. They posit that matter itself has complexities that lie beyond our ken. Like every other animal on earth, we humans are just not smart enough to understand all of nature’s laws and workings.

What’s truly disconcerting about Mysterianism is that, if our intellect is bounded, we can never know how much of existence lies beyond our grasp. What we know or may in the future know may be trifling compared with the unknowable unknowns. “As to myself,” remarked Isaac Newton in his old age, “I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.” It may be that we are all like that child on the strand, playing with the odd pebble or shell — and fated to remain so.

Mysterianism teaches us humility. Through science, we have come to understand much about nature, but much more may remain outside the scope of our perception and comprehension. If the Mysterians are right, science’s ultimate achievement may be to reveal to us its own limits…

On unknowable unknowns: “Question Marks of the Mysterians,” from @roughtype in his terrific newsletter, New Cartographies.

Pair with Flatland (here and here) and Gödel’s second incompleteness theorem.
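For those pairing along, a rough statement of the theorem in question, as one formula: for any consistent, effectively axiomatizable theory T that extends elementary arithmetic,

```latex
% Gödel's second incompleteness theorem (rough statement):
% a sufficiently strong consistent theory cannot prove its own consistency.
T \nvdash \mathrm{Con}(T)
```

That is, the consistency of T, expressed inside T itself, is one of the truths T cannot reach: a formal cousin of the Mysterians’ bounded comprehension.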

* Socrates (per Plato in Apology 22d)

###

As we wonder, we might recall that it was on this date (though different sources offer different November dates) in 1966 that 96 Tears, the debut studio album by the American garage rock band ? and the Mysterians, was released. The title single, which had been released some months earlier, was at #1 on the Billboard Hot 100; the album joined the single on the charts for fifteen weeks, and the follow-up single “I Need Somebody” charted for ten weeks.

“Crossing the river by feeling the stones”*…

How to live in our complex world? Samuel Arbesman on Incremental Humanism…

… there is a decent amount of contingency in the paths that technological innovation takes:

…if we replayed the tape of human history, we would find that the sequence, timing, and (sometimes significant) details of inventions could be quite different, but that the main technological paradigms we discovered would also be discovered there. We would find steam power, electricity, plastics, and digital computers. But we wouldn’t find qwerty keyboards; we might not find keyboards at all. It’s tough to quantify this kind of thing in any meaningful way, and of course we can never know for sure, but my suspicion is that the technology of an alternate history of humans would look about as different from our own as the flora and fauna of Central Asia look from the flora and fauna of the central USA.

So when it comes to innovation, we forever live behind a Veil of Progress. This Veil prevents us not only from understanding the possible positive visions of the future that might win out, but even from grasping how different technologies might recombine for further innovation. There is a certain fogginess towards the innovative future that we live within…

As per Kenneth Stanley and Joel Lehman in their book Why Greatness Cannot Be Planned… in a high-dimensional search space, aiming towards an objective will not work. Instead, it is best to develop novel stepping stones that can be productively recombined. This expanding of the adjacent possible is a much more effective strategy.
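Stanley and Lehman’s stepping-stone idea is often operationalized as “novelty search”: score candidates not by progress toward the objective, but by how unlike previously seen behaviors they are. Here is a minimal, hypothetical sketch under toy assumptions (one-dimensional behaviors, a Gaussian mutation operator), not their actual implementation:

```python
import random

def novelty(behavior, archive, k=5):
    """Mean distance to the k nearest behaviors already in the archive."""
    if not archive:
        return float("inf")
    dists = sorted(abs(behavior - b) for b in archive)
    return sum(dists[:k]) / min(k, len(dists))

def novelty_search(mutate, seed, steps=1000):
    """Accumulate stepping stones by rewarding novelty, not an objective."""
    archive, candidate = [], seed
    for _ in range(steps):
        child = mutate(candidate)
        # Keep whichever of parent/child looks least like what we've seen.
        if novelty(child, archive) >= novelty(candidate, archive):
            candidate = child
        archive.append(candidate)
    return archive

# Toy usage: each candidate's "behavior" is simply its value.
random.seed(0)
stones = novelty_search(mutate=lambda x: x + random.gauss(0, 1), seed=0.0)
print(f"behaviors explored: {min(stones):.1f} to {max(stones):.1f}")
```

There is no objective anywhere in the loop; the archive of stepping stones simply expands the adjacent possible, which is exactly the recombination-friendly strategy the excerpt recommends.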

So how should we operate if we are constantly living behind the Veil of Progress? It requires humility and incremental tinkering.

The idea of humanism, according to Sarah Bakewell, consists of “free thinking, inquiry and hope.” But there are other facets as well, from a sensibility of moderation to a focus on improving the world.

I think incrementalism is also a key feature of humanism. As Adam Gopnik noted in his book A Thousand Small Sanities about liberalism: “Whenever we look at how the big problems got solved, it was rarely a big idea that solved them. It was the intercession of a thousand small sanities.”

This approach, of incremental humanism, is also a necessary part of the ideals of progress. Imagining a better future and incrementally improving towards this, even in an undirected manner, is the way of managing the veil of progress. As Rabbi Tarfon noted in the Talmud, “It is not your duty to finish the work, but neither are you at liberty to neglect it.” We are part of a long chain of improvements, all part of a tech tree that we can’t see and which involves a balance of innovation and maintenance (for we must preserve what we already have if we hope to be able to build on what has come before us). Revolution is the quick bandage that sounds appealing, but don’t be led to think it will necessarily result in enduring change. Big ideas can be seductive, but incremental change is the only way to live under uncertainty.

Living in a complex world where one’s impact is difficult to fully know requires an incremental humanism. This means having a vision of the future, but a more gradual and piecemeal one. This also means having a certain amount of long-term humility…

How to face the future: “Living with the Veil of Progress,” from @arbesman.

* Chen Yun, via Deng Xiaoping

###

As we feel our ways, we might recall that it was on this date in 1961 that Decca Records released “I Fall to Pieces,” written by Hank Cochran and Harlan Howard and performed by the inimitable Patsy Cline. It started slowly, but became Billboard‘s “Song of the Year” and has since, of course, become a classic.

Written by (Roughly) Daily

January 30, 2024 at 1:00 am

“Criticism may not be agreeable, but it is necessary. It fulfills the same function as pain in the human body. It calls attention to an unhealthy state of things.”*…

The estimable Henry Farrell on why, on average, we’re better at criticizing others than at thinking originally ourselves…

… our individual reasoning processes are biased in ways that are really hard for us (individually) to correct. We have a strong tendency to believe our own bullshit. The upside is that if we are far better at detecting bullshit in others than in ourselves, and if we have some minimal good faith commitment to making good criticisms, and entertaining good criticisms when we get them, we can harness our individual cognitive biases through appropriate group processes to produce socially beneficial ends. Our ability to see the motes in others’ eyes while ignoring the beams in our own can be put to good work, when we criticize others and force them to improve their arguments. There are strong benefits to collective institutions that underpin a cognitive division of labor.

This superficially looks to resemble the ‘overcoming bias’/‘less wrong’ approaches to self-improvement that are popular on the Internet. But it ends up going in a very different direction: collective processes of improvement rather than individual efforts to remedy the irremediable. The ideal of the individual seeking to eliminate all sources of bias so that he (it is, usually, a he) can calmly consider everything from a neutral and dispassionate perspective is replaced by a Humean recognition that reason cannot readily be separated from the desires of the reasoner. We need negative criticisms from others, since they lead us to understand weaknesses in our arguments that we are incapable of coming at ourselves, unless they are pointed out to us…

… It’s not about a radical individual virtuosity, but a radical individual humility. Your most truthful contributions to collective reasoning are unlikely to be your own individual arguments, but your useful criticisms of others’ rationales. Even more pungently, you are on average best able to contribute to collective understanding through your criticisms of those whose perspectives are most different to your own, and hence very likely those you most strongly disagree with. The very best thing that you may do in your life is create a speck of intense irritation for someone whose views you vigorously dispute, around which a pearl of new intelligence may then accrete…

… One of my favourite passages from anywhere is the closing of Middlemarch, where Eliot says of Dorothea:

“Her full nature, like that river of which Cyrus broke the strength, spent itself in channels which had no great name on the earth. But the effect of her being on those around her was incalculably diffusive: for the growing good of the world is partly dependent on unhistoric acts; and that things are not so ill with you and me as they might have been, is half owing to the number who lived faithfully a hidden life, and rest in unvisited tombs.”

Striving to be a Dorothea is a noble vocation, and likely the best we can hope for in any event; sooner or later, we will all be forgotten. In the long course of time, all of our arguments and ideas will be broken down and decomposed. At best we may hope, if we are very lucky, that they will contribute in some minute way to a rich humus, from which plants that we will never see or understand might spring.

Eminently worth reading in full: “In praise of negativity,” from @henryfarrell.

* Winston Churchill

###

As we contemplate the constructive, we might recall that it was on this date in 1871 that a discipline wholly dependent on incorporating corrective critique into its methods got its official American start: Cleveland Abbe became the founding chief scientist (effectively the head) of the newly formed U.S. Weather Service (later named the Weather Bureau; later still, the National Weather Service).

Abbe had started the first private weather reporting and warning service (in Cincinnati) and had been issuing weather reports or bulletins since 1869; he was the only person in the country at the time experienced in drawing weather maps from telegraphic reports and forecasting from them. The first U.S. meteorologist, he is known as the “father of the U.S. Weather Bureau,” where he systematized observation, trained personnel, and established scientific methods. He went on to become one of the 33 founders of the National Geographic Society.


“It is well to remember that the entire universe, with one trifling exception, is composed of others”*…

[image: Artist’s impression of the temperate planet Ross 128 b, with its red dwarf parent star in the background. Credit: ESO/M. Kornmesser]

For centuries, scientific discoveries have suggested humanity occupies no privileged place in the universe. But as Mario Livio argues, studies of worlds beyond our solar system could place meaningful new limits on our existential mediocrity…

When the Polish polymath Nicolaus Copernicus proposed in 1543 that the sun, rather than the Earth, was the center of our solar system, he did more than resurrect the “heliocentric” model that had been devised (and largely forgotten) some 18 centuries earlier by the Greek astronomer Aristarchus of Samos. Copernicus—or, rather, the “Copernican principle” that bears his name—tells us that we humans are nothing special. Or, at least, that the planet on which we live is not central to anything outside of us; instead, it’s just another ordinary world revolving around a star.

Our apparent mediocrity has only ascended in the centuries that have passed since Copernicus’s suggestion. In the middle of the 19th century Charles Darwin realized that rather than being the “crown of creation,” humans are simply a natural product of evolution by means of natural selection. Early in the 20th century, astronomer Harlow Shapley deepened our Copernican cosmic demotion, showing that not only the Earth but the whole solar system lacks centrality, residing in the Milky Way’s sleepy outer suburbs rather than the comparatively bustling galactic center. A few years later, astronomer Edwin Hubble showed that galaxies other than the Milky Way exist, and current estimates put the total number of galaxies in the observable universe at a staggering trillion or more.

Since 1995 we have discovered that even within our own Milky Way roughly one of every five sunlike or smaller stars harbors an Earth-size world orbiting in a “Goldilocks” region (neither too hot nor too cold) where liquid water may persist on a rocky planetary surface. This suggests there are at least a few hundred million planets in the Milky Way alone that may in principle be habitable. In roughly the same span of time, observations of the big bang’s afterglow—the cosmic microwave background—have shown that even the ordinary atomic matter that forms planets and people alike constitutes no more than 5 percent of the cosmic mass and energy budget. With each advance in our knowledge, our entire existence retreats from any possible pinnacle, seemingly reduced to flotsam adrift at the universe’s margins.
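The “few hundred million” floor is easy to sanity-check with a back-of-envelope sketch; every input below is a rough, illustrative assumption rather than a number from Livio’s essay:

```python
# Back-of-envelope: potentially habitable Earth-size worlds in the Milky Way.
stars_in_galaxy      = 2e11   # ~100-400 billion stars; assume 200 billion
frac_sunlike_smaller = 0.10   # deliberately conservative cut of eligible stars
frac_goldilocks      = 0.20   # "roughly one of every five," per the excerpt

worlds = stars_in_galaxy * frac_sunlike_smaller * frac_goldilocks
print(f"~{worlds:.1e} candidate worlds")   # ~4.0e+09, comfortably past
                                           # "a few hundred million"
```

Even with the eligible-star fraction squeezed well below its plausible value, the product clears the quoted floor by an order of magnitude.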

Believe it or not, the Copernican principle doesn’t even end there. In recent years increasing numbers of physicists and cosmologists have begun to suspect—often against their most fervent hopes—that our entire universe may be but one member of a mind-numbingly huge ensemble of universes: a multiverse.

Interestingly though, if a multiverse truly exists, it also suggests that Copernican cosmic humility can only be taken so far.

The implications of the Copernican principle may sound depressing to anyone who prefers a view of the world regarding humankind as the central or most important element of existence, but notice that every step along the way in extending the Copernican principle represented a major human discovery. That is, each decrease in the sense of our own physical significance was the result of a huge expansion in our knowledge. The Copernican principle teaches us humility, yes, but it also reminds us to keep our curiosity and passion for exploration alive and vibrant…

Fascinating: “How Far Should We Take Our Cosmic Humility?”, from @Mario_Livio in @sciam.

* John Holmes (the poet)

###

As we ponder our place, we might send carefully-observed birthday greetings to Arno Penzias; he was born on this date in 1933. A physicist and radio astronomer, he and Robert Wilson, a colleague at Bell Labs, discovered the cosmic microwave background radiation, which helped establish the Big Bang theory of cosmology, work for which they shared the 1978 Nobel Prize in Physics.

CMB radiation is something that anyone old enough to have watched broadcast (that’s to say, pre-cable/streaming) television has seen:

The way a television works is relatively simple. A powerful electromagnetic wave is transmitted by a tower, where it can be received by a properly sized antenna oriented in the correct direction. That wave has additional signals superimposed atop it, corresponding to audio and visual information that had been encoded. By receiving that information and translating it into the proper format (speakers for producing sound and cathode rays for producing light), we were able to receive and enjoy broadcast programming right in the comfort of our own homes for the first time. Different channels broadcast at different wavelengths, giving viewers multiple options simply by turning a dial.
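To make “signals superimposed atop it” concrete, here is a minimal sketch of amplitude modulation with a crude envelope detector; the frequencies and modulation depth are illustrative assumptions, not broadcast-standard values:

```python
import numpy as np

fs = 1_000_000                             # sample rate, Hz (assumption)
t = np.arange(0, 0.01, 1 / fs)             # 10 ms of signal
message = np.sin(2 * np.pi * 1_000 * t)    # 1 kHz stand-in for program content
carrier = np.cos(2 * np.pi * 100_000 * t)  # 100 kHz stand-in for a "channel"

# Transmit: superimpose the message on the carrier (amplitude modulation).
transmitted = (1 + 0.5 * message) * carrier

# Receive: rectify, then low-pass -- a crude envelope detector.
rectified = np.abs(transmitted)
kernel = np.ones(50) / 50                  # average over 5 carrier cycles
envelope = np.convolve(rectified, kernel, mode="same")
recovered = envelope - envelope.mean()

print(f"correlation with original: {np.corrcoef(recovered, message)[0, 1]:.2f}")
# ~1.00: the program content rides on, and is recovered from, the carrier
```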

Unless, that is, you turned the dial to channel 03.

Channel 03 was — and if you can dig up an old television set, still is — simply a signal that appears to us as “static” or “snow.” That “snow” you see on your television comes from a combination of all sorts of sources:

– human-made radio transmissions,

– the Sun,

– black holes,

– and all sorts of other directional astrophysical phenomena like pulsars, cosmic rays and more.

But if you were able to either block all of those other signals out, or simply took them into account and subtracted them out, a signal would still remain. It would only be about 1% of the total “snow” signal that you see, but there would be no way of removing it. When you watch channel 03, 1% of what you’re watching comes from the Big Bang’s leftover glow. You are literally watching the cosmic microwave background…

“This Is How Your Old Television Set Can Prove The Big Bang”
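That 1% claim can be illustrated with a toy noise budget, sketched below; the individual source powers are invented for illustration (the real mix depends on antenna, location, and frequency), but the subtract-the-known-sources logic is the excerpt’s:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # samples of antenna "snow"

# Invented noise powers, summing to 1:
human_made = rng.normal(0, np.sqrt(0.70), n)  # transmitters and other RFI
sun        = rng.normal(0, np.sqrt(0.20), n)
astro      = rng.normal(0, np.sqrt(0.09), n)  # black holes, pulsars, cosmic rays
cmb        = rng.normal(0, np.sqrt(0.01), n)  # the Big Bang's leftover glow

snow = human_made + sun + astro + cmb

# "Took them into account and subtracted them out": remove the modeled
# sources; an irreducible ~1% of the power remains -- the CMB.
residual = snow - (human_made + sun + astro)
print(f"residual fraction of snow power: {residual.var() / snow.var():.3f}")
# ~0.010, i.e. about 1% of the static
```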