Posts Tagged ‘philosophy’
This is the age of big data. We are constantly in quest of more numbers and more complex algorithms to crunch them. We seem to believe that this will solve most of the world’s problems – in the economy, society, and even our personal lives. As a corollary, rules of thumb and gut instincts are getting short shrift. We think they often violate the principles of logic and lead us into making bad decisions. We may have had to depend on heuristics and gut feelings in the agricultural and manufacturing eras. But this is the digital age. We can optimise everything.
Gerd Gigerenzer [above], a sixty-nine-year-old German psychologist who has been studying how humans make decisions for most of his career, doesn’t think so. In the real world, rules of thumb not only work well, they also perform better than complex models, he says. We shouldn’t turn our noses up at heuristics; we should embrace them…
Why simple rules of thumb often outperform complex models: “Gigerenzer’s simple rules.”
* John Gardner
As we extrapolate, we might spare a thought for Marie Jean Antoine Nicolas de Caritat, Marquis of Condorcet; he died on this date in 1794. A philosopher, mathematician, and early political scientist, he was a rationalist (and biographer of Voltaire) who advocated a liberal economy, free and equal public instruction, constitutionalism, and equal rights for women and people of all races. He was a formulator of the Enlightenment ideas of progress and of the indefinite perfectibility of humankind. And with his wife (and intellectual partner) Sophie de Grouchy, he hosted a salon that attracted foreign dignitaries and intellectuals including Thomas Jefferson, Thomas Paine, and Cesare Beccaria. But he may be best remembered for the Condorcet method of voting, in which the tally selects the candidate who would beat each of the other candidates in a run-off election.
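The Condorcet method described above is easy to make concrete. Here is a minimal sketch in Python (the candidate names and ballots are illustrative): it scans every pairwise contest and returns the candidate, if any, who would beat each rival in a head-to-head run-off.

```python
def condorcet_winner(ballots):
    """Return the candidate who beats every rival head-to-head,
    or None if no such candidate exists (Condorcet's paradox).
    Each ballot is a full ranking of the candidates, best first."""
    candidates = ballots[0]
    majority = len(ballots) / 2
    for c in candidates:
        # c wins if, against every other candidate d, more than half
        # of the ballots rank c above d.
        if all(
            sum(b.index(c) < b.index(d) for b in ballots) > majority
            for d in candidates
            if d != c
        ):
            return c
    return None

# Three voters ranking candidates A, B, C (best first):
ballots = [["A", "B", "C"], ["B", "A", "C"], ["A", "C", "B"]]
print(condorcet_winner(ballots))  # A beats B 2-1 and C 3-0, so prints A
```

Note that the method can fail to name a winner at all: with the cyclic ballots A>B>C, B>C>A, C>A>B, every candidate loses some head-to-head contest, and the function returns `None`.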
Kick at the rock, Sam Johnson, break your bones:
But cloudy, cloudy is the stuff of stones.
– Richard Wilbur
Materialism holds the high ground these days in debates over that most ultimate of scientific questions: the nature of consciousness. When tackling the problem of mind and brain, many prominent researchers advocate for a universe fully reducible to matter. ‘Of course you are nothing but the activity of your neurons,’ they proclaim. That position seems reasonable and sober in light of neuroscience’s advances, with brilliant images of brains lighting up like Christmas trees while test subjects eat apples, watch movies or dream. And aren’t all the underlying physical laws already known?
From this seemingly hard-nosed vantage, the problem of consciousness seems to be just one of wiring, as the American physicist Michio Kaku argued in The Future of the Mind (2014). In the very public version of the debate over consciousness, those who advocate that understanding the mind might require something other than a ‘nothing but matter’ position are often painted as victims of wishful thinking, imprecise reasoning or, worst of all, an adherence to a mystical ‘woo.’
It’s hard not to feel the intuitional weight of today’s metaphysical sobriety. Like Pickett’s Charge up the hill at Gettysburg, who wants to argue with the superior position of those armed with ever more precise fMRIs, EEGs and the other material artefacts of the materialist position? There is, however, a significant weakness hiding in the imposing-looking materialist redoubt. It is as simple as it is undeniable: after more than a century of profound explorations into the subatomic world, our best theory for how matter behaves still tells us very little about what matter is. Materialists appeal to physics to explain the mind, but in modern physics the particles that make up a brain remain, in many ways, as mysterious as consciousness itself…
The closer you look, the more the materialist explanation of consciousness (and physics) appears to rest on shaky metaphysical ground: “Minding matter.”
* Albert Einstein, riffing on his friend Kurt Gödel
As we think about thinking, we might spare a thought for Frederick Winslow Taylor; he died on this date in 1915. An engineer and inventor (42 patents), he’s best remembered as the father of “Scientific Management,” the discipline rooted in efficiency studies and standardization. Quoth Peter Drucker:
Frederick W. Taylor was the first man in recorded history who deemed work deserving of systematic observation and study. On Taylor’s ‘scientific management’ rests, above all, the tremendous surge of affluence in the last seventy-five years which has lifted the working masses in the developed countries well above any level recorded before, even for the well-to-do. Taylor, though the Isaac Newton (or perhaps the Archimedes) of the science of work, laid only first foundations, however. Not much has been added to them since – even though he has been dead all of sixty years.
Taylor’s work encouraged many followers (including Frank “Cheaper by the Dozen” Gilbreth) and effectively spawned the field of management consulting. But Taylor practiced what he preached, and found time to become a champion tennis player as well: he won the first doubles tournament (1881) in the U.S. National Championships, the precursor of the U.S. Open (with partner Clarence Clark).
“Dimension regulated the general scale of the work, so that the parts may all tell and be effective”*…
Readers will know of your correspondent’s fascination with relative scale– cf. “Scaling Away,” “Putting vegetables to exquisite use,” and “Nothing can better cure the anthropocentrism that is the author of all our ills than to cast ourselves into the physics of the infinitely large (or the infinitely small),” for example.
As we survey size, we might spare a thought for Saint Thomas Aquinas; he died on this date in 1274. A Dominican friar, Catholic priest, and Doctor of the Church, he was an immensely influential philosopher, theologian, and jurist in the tradition of Scholasticism. Following Aristotle’s definition of science as sure and evident knowledge obtained from demonstrations, Thomas defined science as the knowledge of things from their causes. In his major work, Summa, he distinguished between demonstrated truth (science) and revealed truth (faith). His influence on Western thought is considerable; much of modern philosophy (especially ethics, natural law, metaphysics, and political theory) developed with reference– in support or opposition– to his ideas.
The challenge that the multiverse poses for the idea of an all-good, all-powerful God is often focused on fine-tuning. If there are infinite universes, then we don’t need a fine tuner to explain why the conditions of our universe are perfect for life, so the argument goes. But some kinds of multiverse pose a more direct threat. The many-worlds interpretation of quantum physicist Hugh Everett III and the modal realism of cosmologist Max Tegmark include worlds that no sane, good God would ever tolerate. The theories are very different, but each predicts the existence of worlds filled with horror and misery.
Of course, plenty of thoughtful people argue that the Earth alone contains too much pain and suffering to be the work of a good God. But many others have disagreed, finding fairly nuanced things to say about what might justify God’s creation of a world that includes a planet like ours. For example, there is no forgiveness, courage, or fortitude without at least the perception of wrongs, danger, and difficulty. The most impressive human moral achievements seem to require such obstacles.
Still, many horrifying things happen with nothing seemingly gained from them. And, Everett’s many-worlds and Tegmark’s modal realism both seem to imply that there are huge numbers of horrific universes inhabited solely by such unfortunates. Someone like myself, who remains attracted to the traditional picture of God as loving creator, is bound to find such consequences shocking…
How scientific cosmology puts a new twist on the problem of evil. A theist wrestles with the implications of the “Many Worlds” hypothesis: “Evil Triumphs in These Multiverses, and God Is Powerless.”
* Terry Pratchett
As we calculate our blessings, we might send carefully-addressed birthday greetings to Infante Henrique of Portugal, Duke of Viseu, better known as Prince Henry the Navigator; he was born on this date in 1394. A central figure in 15th-century Portuguese politics and in the earliest days of the Portuguese Empire, Henry encouraged Portugal’s expeditions (and colonial conquests) in Africa– and thus is regarded as the main initiator (as a product both of Portugal’s expeditions and of those that they encouraged by example) of what became known as the Age of Discoveries.
The notion that intelligence could determine one’s station in life… runs like a red thread through Western thought, from the philosophy of Plato to the policies of UK prime minister Theresa May. To say that someone is or is not intelligent has never been merely a comment on their mental faculties. It is always also a judgment on what they are permitted to do. Intelligence, in other words, is political.
Sometimes, this sort of ranking is sensible: we want doctors, engineers and rulers who are not stupid. But it has a dark side. As well as determining what a person can do, their intelligence – or putative lack of it – has been used to decide what others can do to them. Throughout Western history, those deemed less intelligent have, as a consequence of that judgment, been colonised, enslaved, sterilised and murdered (and indeed eaten, if we include non-human animals in our reckoning).
It’s an old, indeed an ancient, story. But the problem has taken an interesting 21st-century twist with the rise of Artificial Intelligence (AI)…
Go mental at “Intelligence: a history.”
Pair with Isaac Asimov’s lighter piece to the same point, “What is intelligence, anyway?”
* Oscar Wilde
As we celebrate variety, we might send thoughtful birthday greetings to Michel Eyquem de Montaigne; he was born on this date in 1533. Best known during his lifetime as a statesman, Montaigne is remembered for popularizing the essay as a literary form. His effortless merger of serious intellectual exercises with casual anecdotes and autobiography– and his massive volume Essais (translated literally as “Attempts” or “Trials”)– contain what are, to this day, some of the most widely-influential essays ever written. Montaigne had a powerful impact on writers ever after, from Descartes, Pascal, and Rousseau, through Hazlitt, Emerson, and Nietzsche, to Zweig, Hoffer, and Asimov. Indeed, he’s believed to have been an influence on the later works of Shakespeare.
“Consciousness cannot be accounted for in physical terms. For consciousness is absolutely fundamental. It cannot be accounted for in terms of anything else.”*…
As a neuroscientist, I am frequently asked about consciousness. In academic discourse, the celebrated problem of consciousness is often divided into two parts: the “Easy Problem” involves identifying the processes in the brain that correlate with particular conscious experiences. The “Hard Problem” involves murkier questions: what are conscious experiences, and why do they exist at all? This neat separation into Easy and Hard problems, which comes courtesy of the Australian philosopher David Chalmers, seems to indicate a division of labor. The neuroscientists, neurologists and psychologists can, at least in principle, systematically uncover the neural correlates of consciousness. Most of them agree that calling this the “Easy Problem” somewhat underestimates the theoretical and experimental challenges involved. It may not be the Hard Problem, but at the very least it’s A Rather Hard Problem. And many philosophers and scientists think that the Hard Problem may well be a non-problem, or, as Ludwig Wittgenstein might have said, the kind of problem that philosophers typically devise in order to maximize unsolvability.
One might assume that as a neuroscientist, I should be gung-ho to prove the imperious philosophers wrong, and to defend the belief that science can solve any sort of problem one might throw at it: hard, soft, or half-baked. But I have become increasingly convinced that science is severely limited in what it can say about consciousness. In a very important sense, consciousness is invisible to science…
Yohan John on “Why some neuroscientists call consciousness ‘the C-word’.” Via the always-illuminating 3 Quarks Daily.
* Erwin Schrödinger
As we muse on mind, we might spare a thought for Mary Whiton Calkins; she died on this date in 1930. A psychologist and philosopher, Calkins studied psychology at Harvard as a “guest” (since women could not officially register there in her day). Though she completed all requirements for a doctorate, and had the strong support of William James and her other professors, Harvard still refused to grant a degree to a woman. She went on to become the first prominent woman in her fields: After leaving Harvard, she established the first psychology laboratory at a women’s college (Wellesley), and later became the first female president of both the American Psychological Association and the American Philosophical Association.
Let’s imagine we’re on a beach that’s a mile long, and on that beach there are a couple of ice cream carts…
Let’s also imagine that the ice cream sold at each cart is identical in quality and cost, so the only reason customers choose one cart over the other is that one is closer. Given all of that, the best locations for the carts are with each cart halfway between the middle of the beach and one of the ends. In this arrangement each cart gets 50% of the customers, and no one has to walk more than 1/4 mile to get some ice cream.
But what if one of the ice cream vendors decides to move their cart a bit closer to the middle of the beach…
They are now the ice cream cart of choice for a bigger segment of the beach, and will get more business. The other ice cream cart has no choice but to retaliate…
Now once again they each serve the same percentage of the beach-going public. Since any further movement by either cart would mean a loss of business for that cart, they end up permanently side by side, in the middle of the beach, even though this is a less optimal location for their customers.
That is a simple example of something called Hotelling’s law: the tendency of competing products to end up as similar as possible…
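The leapfrogging story above can be simulated directly. A minimal sketch in Python (the discretization into 101 spots and the starting positions are illustrative assumptions): each vendor in turn moves to whichever spot captures the most beachgoers given the other’s position, and the two carts converge on the middle of the beach.

```python
def best_response(rival, positions):
    """Return the position that captures the most customers, given the
    rival's location; each beachgoer walks to the nearer cart, and exact
    ties split evenly between the two carts."""
    def share(mine):
        return sum(
            1.0 if abs(x - mine) < abs(x - rival)
            else 0.5 if abs(x - mine) == abs(x - rival)
            else 0.0
            for x in positions
        )
    return max(positions, key=share)

# Discretize a one-mile beach into 101 spots, and start the carts at
# the "socially optimal" quarter points described above.
beach = [i / 100 for i in range(101)]
a, b = 0.25, 0.75
for _ in range(200):  # alternate best responses until nothing changes
    a = best_response(b, beach)
    b = best_response(a, beach)
print(a, b)  # both carts end up side by side at the middle: 0.5 0.5
```

The fixed point at the center is exactly the equilibrium in the story: from (0.5, 0.5) neither cart can gain customers by moving, even though customers at the ends now walk up to half a mile.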
As we nose around for niches, we might send ambitious birthday greetings to Count Giovanni Pico della Mirandola; he was born on this date in 1463. An Italian philosopher, he undertook, in 1486, at the age of 23, to defend 900 theses on religion, philosophy, natural philosophy and magic against all comers, in the process of which he wrote his famous Oration on the Dignity of Man, which has been called the “Manifesto of the Renaissance”; a revitalization of Neo-Platonism, it was a seminal text of Renaissance humanism and of what has been called the “Hermetic Reformation.”