(Roughly) Daily


“Don’t let us forget that the causes of human actions are usually immeasurably more complex and varied than our subsequent explanations of them”*…

Further, in a fashion, to yesterday’s post: Patricia Fara explains how the tension between religion and science as arbiters of knowledge came to a head in the French Revolution, and how that inspired Lambert Adolphe Jacques Quetelet, a Belgian astronomer, mathematician, statistician, and sociologist, to introduce a radically new way of thinking about human beings:

… God had been forcefully excluded from astronomy during the French Revolution, when Pierre-Simon Laplace rewrote Newton’s ideas to create his deterministic cosmos, in which scientific laws govern every movement of every planet with no need for divine intervention. Inspired by this success, a Belgian astronomer called Adolphe Quetelet decided that human societies are also controlled by laws. Each country has its own statistical patterns that remain constant from year to year–suicide and crime rates, for instance–and so Quetelet suggested that an ‘average man’ can consistently encapsulate a nation’s characteristics. Politicians should, Quetelet prescribed, operate like social physicists and try to improve average behaviour rather than worry about extreme anomalies. For him, variations from the statistical mean were–like planetary wobbles–imperfections to be smoothed out so that overall progress could be ensured.

Quetelet had introduced a radically new way of thinking about human beings. As one of his admirers put it, ‘Man is seen to be an enigma only as an individual, in mass, he is a mathematical problem.’ Quetelet’s successors took his ideas in many different directions. For one thing, his work was valuable politically because it could be interpreted in different ways. While conservatives insisted that little could be done to alter the current system, radicals accused governments of impeding the natural course of progress, and Utopians–such as Karl Marx–envisaged harmonious societies governed by nature’s own laws guaranteeing improvement. Data collection projects proliferated, and statisticians searched for laws governing every aspect of life, ranging from the weather to the growth of civilization, from stock market fluctuations to the incidence of disease. Many scientists took their ideas from Quetelet rather than from abstract textbooks–but they added their own twist. Whereas Quetelet regarded individual deviations from the norm as errors to be eliminated, scientists set out to study how variations occur…
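The statistical move at the heart of Quetelet’s “social physics” is simple enough to sketch: collapse a population’s yearly figures into a mean, then decide whether the spread around that mean is noise to smooth away (Quetelet) or the real object of study (his successors). A toy illustration in Python, using hypothetical crime-rate figures, not data from Fara’s text:

```python
# A toy illustration of the "average man" versus the study of variation.
# The yearly figures below are hypothetical, purely for illustration.
yearly_crime_rates = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1]  # offenses per 1,000 people

n = len(yearly_crime_rates)
mean = sum(yearly_crime_rates) / n  # Quetelet's "average man": one number for the nation

# Quetelet read deviations from the mean as errors to be smoothed out;
# his successors treated the spread itself as the thing worth studying.
variance = sum((x - mean) ** 2 for x in yearly_crime_rates) / n
std_dev = variance ** 0.5

print(f"mean rate: {mean:.2f} per 1,000")
print(f"spread (std dev): {std_dev:.2f} per 1,000")
```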

An excerpt from Fara’s Science: A Four Thousand Year History, via the invaluable Delanceyplace.com (@delanceyplace): “God, Science, and Data.”

* Fyodor Dostoevsky, The Idiot

###

As we focus on frames, we might spare a thought for a man who kept his eye on the individual, Wilhelm Reich. A medical doctor and psychoanalyst, he was a member of the second generation of analysts after Sigmund Freud. Reich developed a system of psychoanalysis concentrating on overall character structure, rather than on individual neurotic symptoms. His early work on psychoanalytic technique was overshadowed by his involvement in the sexual-politics movement and by “orgonomy,” a pseudoscientific system he developed. He also built a device he called a cloudbuster, with which he claimed he could manipulate the weather by manipulating the “orgone” in the atmosphere. Reich’s claims aroused much controversy: the Food and Drug Administration (FDA) obtained a court injunction against his orgone devices, and when the injunction was violated, Reich was convicted of contempt of court. The court ordered his books and research burned and his equipment destroyed, and Reich was sentenced to prison, where he died of heart failure on this date in 1957.

source

“Many of the things you can count, don’t count. Many of the things you can’t count, really count”*…

Still, we count… and have, as Keith Houston explains, for much, if not most, of human history…

Figuring out when humans began to count systematically, with purpose, is not easy. Our first real clues are a handful of curious, carved bones dating from the final few millennia of the three-million-year expanse of the Old Stone Age, or Paleolithic era. Those bones are humanity’s first pocket calculators: For the prehistoric humans who carved them, they were mathematical notebooks and counting aids rolled into one. For the anthropologists who unearthed them thousands of years later, they were proof that our ability to count had manifested itself no later than 40,000 years ago.

Counting, fundamentally, is the act of assigning distinct labels to each member of a group of similar things to convey either the size of that group or the position of individual items within it. The first type of counting yields cardinal numbers such as “one,” “two,” and “three”; the second gives ordinals such as “first,” “second,” and “third.”
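That distinction is easy to make concrete in code. A toy sketch (mine, not Houston’s) of cardinal versus ordinal labels:

```python
herd = ["ewe", "ram", "lamb"]

# Cardinal: one label conveying the size of the whole group.
print(f"{len(herd)} sheep")  # -> "3 sheep"

# Ordinal: a label for each member's position within the group.
ordinals = {1: "first", 2: "second", 3: "third"}
for position, sheep in enumerate(herd, start=1):
    print(f"{ordinals[position]}: {sheep}")  # -> "first: ewe", "second: ram", ...
```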

At first, our hominid ancestors probably did not count very high. Many body parts present themselves in pairs—arms, hands, eyes, ears, and so on—thereby leading to an innate familiarity with the concept of a pair and, by extension, the numbers 1 and 2. But when those hominids regarded the wider world, they did not yet find a need to count much higher. One wolf is manageable; two wolves are a challenge; any more than that and time spent counting wolves is better spent making oneself scarce. The result is that the very smallest whole numbers have a special place in human culture, and especially in language. English, for instance, has a host of specialized terms centered around twoness: a brace of pheasants; a team of horses; a yoke of oxen; a pair of, well, anything. An ancient Greek could employ specific plurals to distinguish between groups of one, two, and many friends (ho philos, tō philō, and hoi philoi). In Latin, the numbers 1 to 4 get special treatment, much as they do in English, where “one” and “two” yield the irregular ordinals “first” and “second,” while “three” and “four” correspond directly with “third” and “fourth.” The Romans extended that special treatment into their day-to-day lives: after their first four sons, a Roman family would typically name the rest by number (Quintus, Sextus, Septimus, and so forth), and only the first four months of the early Roman calendar had proper names. Even tally marks, the age-old “five-barred gate” used to score card games or track rounds of drinks, speak of a deep-seated need to keep things simple.
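The five-barred gate is itself a tiny algorithm: complete groups of five, plus a remainder. A minimal sketch (the flat glyphs below stand in for the usual diagonal stroke through four uprights):

```python
def tally(n: int) -> str:
    """Render a count as five-barred-gate tally marks."""
    groups, remainder = divmod(n, 5)
    marks = ["||||/"] * groups          # "||||/" stands in for a crossed-out group of five
    if remainder:
        marks.append("|" * remainder)   # leftover single strokes
    return " ".join(marks)

print(tally(12))  # -> ||||/ ||||/ ||
```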

Counting in the prehistoric world would have been intimately bound to the actual, not the abstract. Some languages still bear traces of this: a speaker of Fijian may say doko to mean “one hundred mulberry bushes,” but also koro to mean “one hundred coconuts.” Germans will talk about a Faden, meaning a length of thread about the same width as an adult’s outstretched arms. The Japanese count different kinds of things in different ways: there are separate sequences of cardinal numbers for books; for other bundles of paper such as magazines and newspapers; for cars, appliances, bicycles, and similar machines; for animals and demons; for long, thin objects such as pencils or rivers; for small, round objects; for people; and more.
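For readers who want the Japanese example pinned down, here is an illustrative, much simplified subset of those counter words as a lookup table. Real usage involves many more counters and sound changes in the readings, so treat this as a sketch rather than a reference:

```python
# A simplified subset of Japanese counter words, one per category Houston lists.
# Readings shift with the number in practice (e.g., one small animal is "ippiki").
counters = {
    "bound volumes (books)":      "冊 (satsu)",
    "newspapers and magazines":   "部 (bu)",
    "machines, cars, appliances": "台 (dai)",
    "small animals":              "匹 (hiki)",
    "long, thin objects":         "本 (hon)",
    "small, round objects":       "個 (ko)",
    "people":                     "人 (nin)",
}

print(f"3{counters['bound volumes (books)']}")  # -> 3冊 (satsu), "three books"
```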

Gradually, as our day-to-day lives took on more structure and sophistication, so, too, did our ability to count. When farming a herd of livestock, for example, keeping track of the number of one’s sheep or goats was of paramount importance, and as humans divided themselves more rigidly into groups of friends and foes, those who could count allies and enemies had an advantage over those who could not. Number words graduated from being labels for physical objects into abstract concepts that floated around in the mental ether until they were assigned to actual things.

Even so, we still have no real idea how early humans started to count in the first place. Did they gesture? Speak? Gather pebbles in the correct amount? To form an educated guess, anthropologists have turned to those tribes and peoples isolated from the greater body of humanity, whether by accident of geography or deliberate seclusion. The conclusion they reached is simple. We learned to count with our fingers…

An excerpt from Houston’s new book, Empire of the Sum: The Rise and Reign of the Pocket Calculator: “The Early History of Counting,” from @OrkneyDullard in @laphamsquart.

* Albert Einstein

###

As we tally, we might send carefully calculated birthday greetings to Stephen Wolfram; he was born on this date in 1959. A computer scientist, mathematician, physicist, and businessman, he has made contributions to all of these fields. But he is probably best known for his creation of the software system Mathematica (a kind of “idea processor” that allows scientists and technologists to work fluidly in equations, code, and text), which is linked to WolframAlpha (an online answer engine that provides additional data, some of which is kept updated in real time).

source

Written by (Roughly) Daily

August 29, 2023 at 1:00 am

“Homo sapiens, the only creature endowed with reason, is also the only creature to pin its existence on things unreasonable”*…

We appeared 800,000-300,000 years ago, or in the last 1.5%-5.3% of hominid history

How, Sarah Constantin asks, did we humans get so smart?

If you zoom way out and look at the history of life on Earth, humans evolved incredibly recently. The Hominidae — the family that includes orangutans, chimpanzees, bonobos, gorillas, and humans — only arose 20 million years ago, in the most recent 0.5% of evolutionary history.

Within the Hominidae, in turn, Homo sapiens is a very recent development [see image at top]. We appeared 800,000-300,000 years ago, or in the last 1.5%-5.3% of hominid history.

If you look at early hominid “technological” milestones like tool use or cooking, though, they’re a lot more spread out over time. That’s interesting.

There’s nothing to suggest that a single physical change in brains should have given us both tool use and fire, for instance; if that were the case, you’d expect to see them show up at the same time.

Purposeful problem-solving behaviors like tool use and cooking are not unique to hominids; some other mammals and birds use tools, and lots of vertebrates (including birds and fish) can learn to solve puzzles to get a food reward. The general class of “problem-solving behavior” that we see, to one degree or another, in many vertebrates, doesn’t seem to have arisen surprisingly fast compared to the existence of animals in general.

However, to the extent that Homo sapiens has unique cognitive abilities, those did show up surprisingly recently, and it makes sense to privilege the hypothesis that they have a common physical cause.

So what are these special human-unique cognitive abilities?…

“Is Human Intelligence Simple? Part 1: Evolution and Archaeology,” from @s_r_constantin. Part 2 is here.

* Henri Bergson

###

As we study our species, we might send self-examining birthday greetings to Giambattista Vico; he was born on this date in 1668.  A political philosopher, rhetorician, historian, and jurist, Vico was one of the greatest Enlightenment thinkers.  Best known for the Scienza Nuova (1725, often published in English as New Science), he famously criticized the expansion and development of modern rationalism and was an apologist for classical antiquity.

He was an important precursor of systemic and complexity thinking (as opposed to Cartesian analysis and other kinds of reductionism); and he can be credited with the first exposition of the fundamental aspects of social science (and so is considered by many to be the first forerunner of cultural anthropology and ethnography), though his views did not necessarily influence the first social scientists. Vico is often claimed to have fathered modern philosophy of history (although the term is not found in his text; Vico speaks of a “history of philosophy narrated philosophically”). While he was not, strictly speaking, a historicist, interest in him has been driven by historicists (like Isaiah Berlin).

source

“Human history seems to me to be one long story of people sweeping down—or up, I suppose—replacing other people in the process”*…

Max Roser argues that, if we keep each other safe – and protect ourselves from the risks that nature and we ourselves pose – we are only at the beginning of human history…

… The development of powerful technology gives us the chance to survive for much longer than a typical mammalian species.

Our planet might remain habitable for roughly a billion years. If we survive as long as the Earth stays habitable, and based on the scenario above, this would be a future in which 125 quadrillion children will be born. A quadrillion is a 1 followed by 15 zeros: 1,000,000,000,000,000.

A billion years is a thousand times longer than the million years depicted in this chart. Even very slow-moving changes will entirely transform our planet over such a long stretch of time: a billion years is a timespan in which the world will go through several supercontinent cycles – the world’s continents will collide and drift apart repeatedly; new mountain ranges will form and then erode; the oceans we are familiar with will disappear and new ones will open up…

… the future is big. If we keep each other safe the huge majority of humans who will ever live will live in the future.

And this requires us to be more careful and considerate than we currently are. Just as we look back to the heroes who achieved what we enjoy today, those who come after us will remember what we did for them. We will be the ancestors of a very large number of people. Let’s make sure we are good ancestors…
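Roser’s 125 quadrillion is easy to sanity-check: it is roughly today’s global birth rate (around 130 million births per year) held constant for a billion years. A back-of-the-envelope check in Python; the constant-rate assumption is mine, as a simplification of his scenario:

```python
# Back-of-the-envelope check on the 125 quadrillion figure.
# Assumption (mine, simplifying Roser's scenario): a birth rate near
# today's ~130 million/year holds constant over the habitable span.
habitable_years = 1_000_000_000        # ~1 billion years of remaining habitability
births_per_year = 125_000_000          # ~125 million births per year

total_births = habitable_years * births_per_year
print(f"{total_births:,}")             # 125,000,000,000,000,000 -- 125 quadrillion
```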

If we manage to avoid a large catastrophe, we are living at the early beginnings of human history: “The Future is Vast,” from @MaxCRoser @OurWorldInData.

* Alexander McCall Smith

###

As we take the long view, we might recall that it was on this date in 1915 that Mary Mallon, “Typhoid Mary,” was put in quarantine on North Brother Island, in New York City, where she was isolated until she died in 1938. She was the first person in the United States identified as an asymptomatic carrier of the pathogen associated with typhoid fever… before her identification, she had spread typhoid, first inadvertently and then knowingly, for years while working as a cook in the New York area.

Mallon had previously been identified as a carrier (in 1907) and quarantined for three years, after which she was set free on the condition that she change her occupation and embrace good hygiene habits. But after working a lower-paying job as a laundress, Mary changed her last name to Brown and returned to cooking… and over the next five years the infectious cycle repeated, until she was identified and put back into quarantine.

source

“We are a species which is naturally moved by curiosity, the only one left of a group of species (the genus Homo) made up of a dozen equally curious species”*…

… and one thing that curiosity might lead us to wonder is where evolution might take humanity from here. As Nick Longrich points out…

Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.

He reasons to some fascinating possibilities…

Humanity is the unlikely result of 4 billion years of evolution.

From self-replicating molecules in Archean seas, to eyeless fish in the Cambrian deep, to mammals scurrying from dinosaurs in the dark, and then, finally, improbably, ourselves – evolution shaped us.

Organisms reproduced imperfectly. Mistakes made when copying genes sometimes made them better fit to their environments, so those genes tended to get passed on. More reproduction followed, and more mistakes, the process repeating over billions of generations. Finally, Homo sapiens appeared. But we aren’t the end of that story. Evolution won’t stop with us, and we might even be evolving faster than ever.

It’s hard to predict the future. The world will probably change in ways we can’t imagine. But we can make educated guesses. Paradoxically, the best way to predict the future is probably looking back at the past, and assuming past trends will continue going forward. This suggests some surprising things about our future…

Meet our future selves: “Future evolution: from looks to brains and personality, how will humans change in the next 10,000 years?” – @NickLongrich in @ConversationUS.

* Carlo Rovelli, Seven Brief Lessons on Physics

###

As we ponder the possible, we might send insightful birthday greetings to Richard Dawkins; he was born on this date in 1941. An evolutionary biologist, he made a number of important contributions to the public understanding of evolution. In his 1976 book The Selfish Gene, he popularized the gene-centred view of evolution and introduced the term meme. In The Extended Phenotype (1982), he introduced the influential concept that the phenotypic effects of a gene are not necessarily limited to an organism’s body, but can stretch far into the environment. And in The Blind Watchmaker (1986), he argued against the watchmaker analogy, an argument for the existence of a supernatural creator based upon the complexity of living organisms; instead, he described evolutionary processes as analogous to a blind watchmaker, in that reproduction, mutation, and selection are unguided by any designer.

source