(Roughly) Daily

“A nothing will serve just as well as a something about which nothing could be said”*…

Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…

If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.

The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – an experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.

As a scientist and mathematician, I find that Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.

I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.

To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…

A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.

* Ludwig Wittgenstein, Philosophical Investigations

###

As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located in Switzerland, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.

Because researchers need remote access to these facilities, the lab has historically been a major wide area network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser” – and effectively fomented the emergence of the web.

CERN’s main site, from Switzerland looking towards France

“Behind the hieroglyphic streets there would either be a transcendent meaning, or only the earth”*…

Gerardo Dottori, Explosion of Red on Green, 1910, oil on canvas. London, Tate Modern. [source]

A crop of new books attempts to explain the allure of conspiracy theories and the power of belief; Trevor Quirk considers them…

For the millions who were enraged, disgusted, and shocked by the Capitol riots of January 6, the enduring object of skepticism has been not so much the lie that provoked the riots but the believers themselves. A year out, and book publishers confirmed this, releasing titles that addressed the question still addling public consciousness: How can people believe this shit? A minority of rioters at the Capitol had nefarious intentions rooted in authentic ideology, but most of them conveyed no purpose other than to announce to the world that they believed — specifically, that the 2020 election was hijacked through an international conspiracy — and that nothing could sway their confidence. This belief possessed them, not the other way around.

At first, I’d found the riots both terrifying and darkly hilarious, but those sentiments were soon overcome by a strange exasperation that has persisted ever since. It’s a feeling that has robbed me of my capacity to laugh at conspiracy theories — QAnon, chemtrails, lizardmen, whatever — and the people who espouse them. My exasperation is for lack of an explanation. I see Trump’s most devoted hellion, rampaging down the halls of power like a grade schooler after the bell, and I need to know the hidden causes of his dopey rebellion. To account for our new menagerie of conspiracy theories, I told myself, would be to reclaim the world from entropy, to snap experience neatly to the grid once again. I would use recent books as the basis for my account of conspiracy theories in the age of the internet. From their pages I would extract insights and errors like newspaper clippings, pin the marginal, bizarre, and seemingly irrelevant details to the corkboard of my mind, where I could spy eerie resonances, draw unseen connections. At last, I could reveal that our epistemic bedlam is as a Twombly canvas — messy but decipherable…

Learn with @trevorquirk: “Out There,” in @GuernicaMag.

* Thomas Pynchon, The Crying of Lot 49

###

As we tangle with truth, we might send rigorous birthday greetings to Gustav Bergmann; he was born on this date in 1906. A philosopher, he was a member of the Vienna Circle, a group of philosophers and scientists drawn from the natural and social sciences, logic and mathematics, whose values were rooted in the ideals of the Enlightenment. Their approach, logical positivism, an attempt to use logic to make philosophy “scientific,” has had immense influence on 20th-century philosophy, especially on the philosophy of science and analytic philosophy… even if it has not, in fact, eliminated the issues explored above.


We might also send birthday greetings in the form of logical and semantic puzzles both to the precocious protagonist of Alice’s Adventures in Wonderland and to her inspiration, Alice Liddell; they were “born” on this date in 1852.


“The truth is rarely pure and never simple”*…

For a century, the idea of truth has been deflated, becoming terrain from which philosophers fled. Crispin Sartwell argues that they must return – urgently…

It is often said, rather casually, that truth is dissolving, that we live in the ‘post-truth era’. But truth is one of our central concepts – perhaps our most central concept – and I don’t think we can do without it. To believe that masks prevent the spread of COVID-19 is to take it to be true that they do. To assert it is to claim that it is true. Truth is, plausibly, central to thought and communication in every case. And, of course, it’s often at stake in practical political debates and policy decisions, with regard to climate change or vaccines, for example, or who really won the election, or whom we should listen to about what.

One might have hoped to turn to philosophy for a clarification of the nature of truth, and maybe even a celebration of it. But philosophy of pragmatist, analytic and continental varieties lurched into the post-truth era a century ago. If truth is a problem now for everyone, if the idea seems empty or useless in ‘the era of social media’, ‘science denialism’, ‘conspiracy theories’ and suchlike, maybe that just means that ‘everyone’ has caught up to where philosophy was in 1922…

[Sartwell sketches the last 100 years of philosophy, and its undermining of the very idea of truth.]

I don’t think, despite all the attacks on the notion by all sorts of philosophers for a good century, that we’re going to be able to do without truth. In a way, I don’t think all those attacks touched truth at all, which (we’re finding) is necessary, still the only possible cure…

As a first step… we might broaden the focus from the philosophical question of what makes a sentence or proposition true or false to focus on some of the rich ways the concept of truth functions in our discourse. That love is true does not mean that it is a representation that matches up to reality. It does not mean that the love hangs together with all the rest of the lover or lovee’s belief system. It doesn’t mean that the hypothesis that my love is true helps us resolve our problems (it might introduce more problems). It means that the love is intense and authentic, or, as I’d like to put it, that it is actual, real. That my aim is true does not indicate that my aim accurately pictures the external world, but that it thumps the actual world right in the centre, as it were.

Perhaps what is true or false isn’t only, or even primarily, propositions, but loves and aims, and the world itself. That is, I would like to start out by thinking of ‘true’ as a semi-synonym of ‘real’. If I were formulating in parallel to Aristotle, I might say that ‘What is, is true.’ And perhaps there’s something to be said for Heidegger’s ‘comportment’ after all: to know and speak the real requires a certain sort of commitment: a commitment to face reality. Failures of truth are, often, failures to face up. Now, I’m not sure how much that will help with mathematics, but maths needs to understand that it is only one among the many forms of human knowledge. We, or at any rate I, might hope that an account that addresses the traditional questions about propositional truth might emerge from this broader structure of understanding. That is speculative, I admit.

Truth may not be the eternal unchanging Form that Plato thought it was, but that doesn’t mean it can be destroyed by a few malevolent politicians, tech moguls or linguistic philosophers, though the tech moguls and some of the philosophers (David Chalmers, for instance) might be trying to undermine or invent reality, as well. Until they manage it, the question of truth is as urgent, or more urgent, than ever, and I would say that despite the difficulties, philosophers need to take another crack. Perhaps not at aletheia as a joy forever, but at truth as we find it, and need it, now…

On why philosophy needs to return to the question of truth: “Truth Is Real,” from @CrispinSartwell in @aeonmag.

Source of the image above, also relevant: “The difference between ‘Truth’ and ‘truth’.”

* Oscar Wilde

###

As we wrestle with reality, we might recall that it was on this date in 1986 that Geraldo Rivera opened “Al Capone’s Vault”…

Notorious “most wanted” gangster Al Capone began his life of crime in Chicago in 1919 and kept his headquarters at the Lexington Hotel until his arrest in 1931. Years later, renovations were being made at the hotel when a team of workers discovered a shooting range and a series of connected tunnels that led to taverns and brothels, making for an easy escape should there be a police raid. Rumors spread that Capone had a secret vault hidden under the hotel as well. In 1985, news reporter Geraldo Rivera had been fired from ABC after he criticized the network for canceling his report about an alleged relationship between John F. Kennedy and Marilyn Monroe. It seemed like a good time for Rivera to scoop a new story to repair his reputation. It was on this day in 1986 that his live, two-hour, syndicated TV special, The Mystery of Al Capone’s Vaults, aired. After lots of backstory, the time finally came to reveal what was in that vault. It turned out to be empty. After the show, Rivera was quoted as saying, “Seems like we struck out.”



“With my tongue in one cheek only, I’d suggest that had our palaeolithic ancestors discovered the peer-review dredger, we would still be sitting in caves”*…

As a format, “scholarly” scientific communications are slow, encourage hype, and are difficult to correct. Stuart Ritchie argues that a radical overhaul of publishing could make science better…

… Having been printed on paper since the very first scientific journal was inaugurated in 1665, the overwhelming majority of research is now submitted, reviewed and read online. During the pandemic, it was often devoured on social media, an essential part of the unfolding story of Covid-19. Hard copies of journals are increasingly viewed as curiosities – or not viewed at all.

But although the internet has transformed the way we read it, the overall system for how we publish science remains largely unchanged. We still have scientific papers; we still send them off to peer reviewers; we still have editors who give the ultimate thumbs up or down as to whether a paper is published in their journal.

This system comes with big problems. Chief among them is the issue of publication bias: reviewers and editors are more likely to give a scientific paper a good write-up and publish it in their journal if it reports positive or exciting results. So scientists go to great lengths to hype up their studies, lean on their analyses so they produce “better” results, and sometimes even commit fraud in order to impress those all-important gatekeepers. This drastically distorts our view of what really went on.
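
The distortion Ritchie describes is easy to see in a toy simulation. Below is a minimal sketch (my illustration, not from the article; all names and numbers are assumptions): thousands of hypothetical studies of an effect that is truly zero, with the “journal” accepting only positive, statistically significant results. The published record then confidently reports an effect that does not exist.

```python
# A toy simulation of publication bias (illustration only; not from the
# article). The "true" effect is zero, but only studies whose estimate is
# positive and statistically significant make it past the gatekeepers.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.0     # the real effect size: nothing there
N_STUDIES = 10_000    # hypothetical independent studies
NOISE_SD = 1.0        # sampling noise in each study's estimate
Z_CUTOFF = 1.96       # z > 1.96, i.e. a one-sided "p < 0.025" bar

# Each study reports the true effect plus sampling noise.
estimates = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_STUDIES)]

# Publication filter: only positive, significant results get through.
published = [e for e in estimates if e / NOISE_SD > Z_CUTOFF]

print(f"mean effect, all studies:    {statistics.mean(estimates):+.3f}")
print(f"mean effect, published only: {statistics.mean(published):+.3f}")
print(f"fraction published:          {len(published) / N_STUDIES:.1%}")
```

With nothing real to find, roughly 2.5% of studies clear the bar by chance alone, and those published estimates average about +2.3 standard errors: a literature unanimous about an effect that isn’t there.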

There are some possible fixes that change the way journals work. Maybe the decision to publish could be made based only on the methodology of a study, rather than on its results (this is already happening to a modest extent in a few journals). Maybe scientists could just publish all their research by default, and journals would curate, rather than decide, which results get out into the world. But maybe we could go a step further, and get rid of scientific papers altogether…

A bold proposal: “The big idea: should we get rid of the scientific paper?,” from @StuartJRitchie in @guardian.

Apposite (if only in its critical posture): “The Two Paper Rule.” See also “In what sense is the science of science a science?” for context.

* Zygmunt Bauman

###

As we noodle on knowledge, we might recall that it was on this date in 1964 that AT&T connected the first Picturephone call (between Disneyland in California and the World’s Fair in New York). The device consisted of a telephone handset and a small, matching TV, which allowed telephone users to see each other in fuzzy video images as they carried on a conversation. It was commercially released shortly thereafter (prices ranged from $16 to $27 for a three-minute call between special booths AT&T set up in New York, Washington, and Chicago), but didn’t catch on.


“A mind that is stretched by a new idea can never go back to its original dimensions”*…

Alex Berezow observes (in an appreciation of Peter Atkins’ Galileo’s Finger: The Ten Great Ideas of Science) that, while scientific theories are always being tested, scrutinized for flaws, and revised, there are ten concepts so durable that it is difficult to imagine them ever being replaced with something better…

In his book The Structure of Scientific Revolutions, Thomas Kuhn argued that science, instead of progressing gradually in small steps as is commonly believed, actually moves forward in awkward leaps and bounds. The reason for this is that established theories are difficult to overturn, and contradictory data is often dismissed as merely anomalous. However, at some point, the evidence against the theory becomes so overwhelming that it is forcefully displaced by a better one in a process that Kuhn refers to as a “paradigm shift.” And in science, even the most widely accepted ideas could, someday, be considered yesterday’s dogma.

Yet there are some concepts which are considered so rock solid that it is difficult to imagine them ever being replaced with something better. What’s more, these concepts have fundamentally altered their fields, unifying and illuminating them in a way that no previous theory had done before…

The bedrock of modern biology, chemistry, and physics: “The ten greatest ideas in the history of science,” from @AlexBerezow in @bigthink.

* Oliver Wendell Holmes

###

As we forage for first principles, we might send carefully calculated birthday greetings to Georgiy Antonovich Gamov; he was born on this date in 1904. Better known by the name he adopted on immigrating to the U.S., George Gamow, he was a physicist and cosmologist whose early work was instrumental in developing the Big Bang theory of the universe; he also developed the first mathematical model of the atomic nucleus. In 1954, he expanded his interests into biochemistry and his work on deoxyribonucleic acid (DNA) made a basic contribution to modern genetic theory.

But mid-career Gamow began to shift his energy to teaching and to writing popular books on science… one of which, One Two Three… Infinity, inspired legions of young scientists-to-be and kindled a life-long interest in science in an even larger number of other youngsters (including your correspondent).

