“My work consists of two parts; that presented here plus all I have not written. It is this second part that is important.”*…
On the occasion of its centenary, Peter Salmon considers the history, context, and lasting significance of Wittgenstein’s revolutionary first work…
One hundred years ago, a slim volume of philosophy was published by the then unknown Austrian philosopher Ludwig Wittgenstein. The book was as curious as its title, Tractatus Logico-Philosophicus. Running to only 75 pages, it was in the form of a series of propositions, the gnomic quality of the first as baffling to the newcomer today as it was then.
1. The world is all that is the case.
1.1 The world is the totality of facts, not of things.
1.11 The world is determined by the facts, and by their being all the facts.
1.12 For the totality of facts determines what is the case, and also whatever is not the case.
1.13 The facts in logical space are the world.

And so on, through six propositions, 526 numbered statements, equally emphatic and enigmatic, until the seventh and final proposition, which stands alone at the end of the text: “Whereof we cannot speak, thereof we must remain silent.”
The book’s influence was to be dramatic and far-reaching. Wittgenstein believed he had found a “solution” to how language and the world relate, that they shared a logical form. This also set a limit as to what questions could be meaningfully asked. Any question which could not be verified was, in philosophical terms, nonsense.
Written in the First World War trenches, Tractatus is, in many ways, a work of mysticism…
Ludwig Wittgenstein’s Tractatus is as brilliant and baffling today as it was on its publication a century ago: “The logical mystic,” from @petesalmon in @NewHumanist.
* Ludwig Wittgenstein
###
As we wrestle with reason and reality, we might recall that it was on this date in 1930 that Dashiell Hammett‘s The Maltese Falcon— likely a favorite of Wittgenstein’s— was published. In 1990 the novel ranked 10th in the Top 100 Crime Novels of All Time list by the Crime Writers’ Association. Five years later, in a similar list by the Mystery Writers of America, the novel was ranked third.
“Reality is frequently inaccurate”*…
Machine learning and what it may teach us about reality…
Our latest paradigmatic technology, machine learning, may be revealing the everyday world as more accidental than rule-governed. If so, it will be because machine learning gains its epistemological power from its freedom from the sort of generalisations that we humans can understand or apply.
The opacity of machine learning systems raises serious concerns about their trustworthiness and their tendency towards bias. But the brute fact that they work could be bringing us to a new understanding and experience of what the world is and our role in it…
The world is a black box full of extreme specificity: it might be predictable but that doesn’t mean it is understandable: “Learn from Machine Learning,” by David Weinberger (@dweinberger) in @aeonmag.
* Douglas Adams, The Restaurant at the End of the Universe
###
As we ruminate on the real, we might send carefully-computed birthday greetings to Grace Brewster Murray Hopper. A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.
Hopper also found and documented the first computer “bug” (in 1947).
She has both a ship (the guided-missile destroyer USS Hopper) and a super-computer (the Cray XE6 “Hopper” at NERSC) named in her honor.

“It can be argued that in trying to see behind the formal predictions of quantum theory we are just making trouble for ourselves”*…
Context, it seems, is everything…
… What is reality? Nope. There’s no way we are going through that philosophical minefield. Let’s focus instead on scientific realism, the idea that a world of things exists independent of the minds that might perceive it, and that this is the world slowly revealed by progress in science. Scientific realism is the belief that the true nature of reality is the subject of scientific investigation and, while we may not completely understand it at any given moment, each experiment gets us a little bit closer. This is a popular philosophical position among scientists and science enthusiasts.
A typical scientific realist might believe, for example, that fundamental particles exist even though we cannot perceive them directly with our senses. Particles are real and their properties — whatever they may be — form part of the state of the world. A slightly more extreme view is that this state of the world can be specified with mathematical quantities and these, in turn, obey equations we call physical laws. In this view, the ultimate goal of science is to discover these laws. So what are the consequences of quantum physics on these views?
As I mentioned above, quantum physics is not a realistic model of the world — that is, it does not specify quantities for states of the world. An obvious question is then: can we supplement or otherwise replace quantum physics with a deeper set of laws about real states of the world? This is the question Einstein first asked with colleagues Podolsky and Rosen, making headlines in 1935. The hypothetical real states of the world came to be called hidden variables, since an experiment does not reveal them — at least not yet.
In the decades that followed quantum physics rapidly turned into applied science and the textbooks which became canon demonstrated only how to use the recipes of quantum physics. In textbooks that are still used today, no mention is made of the progress in the foundational aspects of quantum physics since the mathematics was cemented almost one hundred years ago. But, in the 1960s, the most important and fundamental aspect of quantum physics was discovered and it put serious restrictions on scientific realism. Some go as far as to say the entire nature of independent reality is questionable due to it. What was discovered is now called contextuality, and its inevitability is referred to as the Bell-Kochen-Specker theorem.
John Bell is the most famous of the trio Bell, Kochen, and Specker, and is credited with proving that quantum physics contained so-called nonlocal correlations, a consequence of quantum entanglement. Feel free to read about those over here.
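Bell’s point about nonlocal correlations can be made numerically. The sketch below — a minimal illustration assuming NumPy, not anything from Ferrie’s article itself — computes the CHSH correlation value for a maximally entangled two-qubit (singlet) state at the standard optimal measurement angles. Any hidden-variable theory of the sort Einstein hoped for bounds this value by 2; quantum physics delivers 2√2.

```python
import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin measurement along an axis at angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet (maximally entangled) two-qubit state: (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(ta, tb):
    """Correlation <A(ta) x B(tb)> of joint measurement outcomes."""
    op = np.kron(spin(ta), spin(tb))
    return np.real(psi.conj() @ op @ psi)

# Standard CHSH measurement settings
a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = corr(a, b) + corr(a, b2) + corr(a2, b) - corr(a2, b2)

print(abs(S))  # ~2.828, i.e. 2*sqrt(2): beyond the classical bound of 2
```

The experiments spanning continents mentioned below are, in essence, measuring this same quantity with real photons.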
It was Bell’s ideas and notions that stuck and eventually led to popular quantum phenomena such as teleportation. Nonlocality itself is wildly popular these days in science magazines with reported testing of the concept in delicately engineered experiments that span continents and sometimes involve research satellites. But nonlocality is just one type of contextuality, which is the real game in town.
In the most succinct sentence possible, contextuality is the name for the fact that any real states of the world giving rise to the rules of quantum physics must depend on contexts that no experiment can distinguish. That’s a lot to unpack. Remember that there are lots of ways to prepare the same experiment — and by the same experiment, I mean many different experiments with completely indistinguishable results. Doing the exact same thing as yesterday in the lab, but having had a different breakfast, will give the same experimental results. But there are things in the lab and very close to the system under investigation that don’t seem to affect the results either. An example might be mixing laser light in two different ways.
There are different types of laser light that, once mixed together, are completely indistinguishable from one another no matter what experiments are performed on the mixtures. You could spend a trillion dollars on scientific equipment and never be able to tell the two mixtures apart. Moreover, knowing only the resultant mixture — and not the way it was mixed — is sufficient to accurately predict the outcomes of any experiment performed with the light. So, in quantum physics, the mathematical theory has a variable that refers to the mixture and not the way the mixture was made — it’s Occam’s razor in practice.
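The mixture variable the article refers to is, in quantum mechanics, the density matrix. A minimal sketch (assuming NumPy; the specific states are illustrative, not from the article): mixing equal parts of the states |0⟩ and |1⟩, or equal parts of the superposition states |+⟩ and |−⟩, are physically different procedures, yet they produce the identical density matrix — so no experiment can ever tell the two preparations apart.

```python
import numpy as np

ket0 = np.array([[1], [0]], dtype=complex)
ket1 = np.array([[0], [1]], dtype=complex)

# Preparation A: 50/50 mixture of |0> and |1>
rho_A = 0.5 * ket0 @ ket0.conj().T + 0.5 * ket1 @ ket1.conj().T

# Preparation B: 50/50 mixture of |+> and |-> (superposition states)
plus  = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
rho_B = 0.5 * plus @ plus.conj().T + 0.5 * minus @ minus.conj().T

# Different mixing procedures, identical density matrix (I/2), hence
# identical predictions for every conceivable measurement.
print(np.allclose(rho_A, rho_B))  # True
```

Quantum theory keeps only the density matrix and discards the mixing context — and contextuality says a deeper theory of reality cannot get away with doing the same.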
Now let’s try to invent a deeper theory of reality underpinning quantum physics. Surely, if we are going to respect Occam’s razor, the states in our model should only depend on contexts with observable consequences, right? If there is no possible experiment that can distinguish how the laser light is mixed, then the underlying state of reality should only depend on the mixture and not the context in which it was made, which, remember, might include my breakfast choices. Alas, this is just not possible in quantum physics — it’s a mathematical impossibility in the theory and has been confirmed by many experiments.
So, does this mean the universe cares about what I have for breakfast? Not necessarily. But, to believe the universe doesn’t care what I had for breakfast means you must also give up reality. You may be inclined to believe that when you observe something in the world, you are passively looking at it just the way it would have been had you not been there. But quantum contextuality rules this out. There is no way to define a reality that is independent of the way we choose to look at it…
“Why is no one taught the one concept in quantum physics which denies reality?” It’s called contextuality and it is the essence of quantum physics. From Chris Ferrie (@csferrie).
* “It can be argued that in trying to see behind the formal predictions of quantum theory we are just making trouble for ourselves. Was not precisely this the lesson that had to be learned before quantum mechanics could be constructed, that it is futile to try to see behind the observed phenomena?” – John Stewart Bell
###
As still we try, we might send relatively hearty birthday greetings to Sir Marcus Laurence Elwin “Mark” Oliphant; he was born on this date in 1901. An Australian physicist who trained and did much of his work in England (where he studied under Sir Ernest Rutherford at the University of Cambridge’s Cavendish Laboratory), Oliphant was deeply involved in the Allied war effort during World War II. He helped develop microwave radar, and– by helping to start the Manhattan Project and then working with his friend Ernest Lawrence at the Radiation Laboratory in Berkeley, California– helped develop the atomic bomb.
After the war, Oliphant returned to Australia as the first director of the Research School of Physical Sciences and Engineering at the new Australian National University (ANU); on his retirement, he became Governor of South Australia and helped found the Australian Democrats political party.
“Doubtless we cannot see that other higher Spaceland now, because we have no eye in our stomachs”*…

An “Amplituhedron,” an illustration of multi-dimensional spacetime
Our architecture, our education and our dictionaries tell us that space is three-dimensional. The OED defines it as ‘a continuous area or expanse which is free, available or unoccupied … The dimensions of height, depth and width, within which all things exist and move.’ In the 18th century, Immanuel Kant argued that three-dimensional Euclidean space is an a priori necessity and, saturated as we are now in computer-generated imagery and video games, we are constantly subjected to representations of a seemingly axiomatic Cartesian grid. From the perspective of the 21st century, this seems almost self-evident.
Yet the notion that we inhabit a space with any mathematical structure is a radical innovation of Western culture, necessitating an overthrow of long-held beliefs about the nature of reality. Although the birth of modern science is often discussed as a transition to a mechanistic account of nature, arguably more important – and certainly more enduring – is the transformation it entrained in our conception of space as a geometrical construct.
Over the past century, the quest to describe the geometry of space has become a major project in theoretical physics, with experts from Albert Einstein onwards attempting to explain all the fundamental forces of nature as byproducts of the shape of space itself. While on the local level we are trained to think of space as having three dimensions, general relativity paints a picture of a four-dimensional universe, and string theory says it has 10 dimensions – or 11 if you take an extended version known as M-Theory. There are variations of the theory in 26 dimensions, and recently pure mathematicians have been electrified by a version describing spaces of 24 dimensions. But what are these ‘dimensions’? And what does it mean to talk about a 10-dimensional space of being?…
Experience says we live in three dimensions; relativity says four; string theory says it’s 10– or more… What are “dimensions” and how do they affect reality? Margaret Wertheim offers a guide: “Radical dimensions.”
* Edwin A. Abbott, Flatland: A Romance of Many Dimensions
###
As we tax our senses, we might spare a thought for Robert Jemison Van de Graaff; he died on this date in 1967. A physicist and engineer, he is best remembered for his creation of the Van de Graaff Generator, an electrostatic generator that creates very high electric potentials– very high voltage direct current (DC) electricity (up to 5 megavolts) at low current levels. A tabletop version can produce on the order of 100,000 volts and can store enough energy to produce a visible spark. Such small Van de Graaff machines are used in physics education to teach electrostatics; larger ones are displayed in some science museums.
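A back-of-the-envelope check on that “visible spark” claim (the dome radius here is an assumption — roughly that of a tabletop machine — not a figure from the source): an isolated conducting sphere has capacitance C = 4πε₀r, and the stored energy is E = ½CV².

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
radius = 0.20      # assumed dome radius in metres (typical tabletop machine)
voltage = 1.0e5    # ~100,000 V, as quoted for a tabletop generator

# Isolated conducting sphere: C = 4 * pi * eps0 * r
capacitance = 4 * math.pi * EPS0 * radius
energy = 0.5 * capacitance * voltage**2   # E = (1/2) C V^2

print(f"C = {capacitance * 1e12:.1f} pF")   # ~22 pF
print(f"E = {energy:.2f} J")                # ~0.1 J
```

A tenth of a joule dumped in a microseconds-long discharge is ample for a bright, audible spark — while the tiny capacitance is why the shock is startling rather than dangerous.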

Boy touching Van de Graaff generator at The Magic House, St. Louis Children’s Museum. Charged with electricity, his hair strands repel each other and stand out from his head.