(Roughly) Daily


“I was conscious that I knew practically nothing”*…

As Joshua Rothman reminds us, we have a lot to learn from studying our ignorance…

… The truth, of course, is that we’re ignorant about the future. Who will win the election in November? Will we lose our jobs because of A.I.? Will the planet boil or merely simmer? What will skyscrapers, or smartphones, or schools look like in thirty years? We’re not in the dark about these questions; we can make educated guesses or predictions. But there’s an odd way in which, the more informed our speculations become, the more they serve to highlight what we don’t know. “The knowledge we possess determines the degree of specificity of the ignorance we recognize,” the philosopher Daniel DeNicola writes, in his book “Understanding Ignorance.” The more you know, the more precisely you can say what you don’t.

DeNicola’s book is an entry in a subfield of philosophy called “agnotology”—the study of ignorance. As philosophical subfields go, agnotology sounds abstract and even a little contradictory: what could it even mean to study what’s unknown? And yet, because ignorance is actually an everyday condition from which we all suffer, the study of it is quite down to earth. Have you ever been in a bookstore, leafed through a weighty tome, and then returned it to the shelf? You are practicing “rational ignorance,” DeNicola writes, by making “the more-or-less conscious decision that something is not worth knowing—at least for me, at least not now.” (In an information-rich society, he notes, knowing when to maintain this kind of ignorance is actually an important skill.) Have you ever tuned out a gossipy friend because you don’t want to know who said what about whom? Deciding that you’d rather be above the fray is “strategic ignorance”; you embrace it because it will make life better, deploying it when you decide not to read the reviews before seeing a movie, or conduct a hiring process in which the names of the candidates are obscured. There’s a big difference between strategic ignorance and what DeNicola calls “involuntary” ignorance: “In the iconic image, Justice is blindfolded, not blind,” he writes.

My wife’s parents have a box of letters that were sent between her grandfather and her grandmother while he was serving in the Navy during the Second World War. The box is in the basement; no one has read the letters, and no one plans to. This reflects a valid concern for privacy, but it also involves what DeNicola calls “willful ignorance”—the persistent, long-term maintenance of a gap in one’s knowledge that could easily be filled in. Willful ignorance isn’t necessarily bad; it might be wise to avoid learning the disturbing details of a half-forgotten traumatic event, for instance, lest they keep the trauma fresh. But we should be wary of willful ignorance, DeNicola argues, because it often flows from fear. “Consider a mother who is so upset about her son’s military service that she refuses to discuss it while he remains on active duty,” DeNicola writes. Or a voter who refuses to read about a favored candidate’s ongoing scandal. “The benefits of willful ignorance tend to be overestimated by those who exhibit it”; knowledge can be a path to overcoming fear.

DeNicola argues that, even when we don’t choose ignorance, there are ways in which we must “dwell in ignorance,” no matter what we do. We’re ignorant of most of what happened in the past because, despite our efforts at historical reconstruction, “worlds disappear” in the flow of time. We’re ignorant about the future not just because we don’t know what will happen but because we lack the ideas needed to comprehend future knowledge: “Galileo could not have known that solar flares produce bursts of radiation,” for example, because the very idea of radiation depends on a “framework of theoretical concepts” that wasn’t developed until hundreds of years after he lived. It turns out that there’s a special word, “ignoration,” which describes the condition of people who “do not even know that they do not know.” In a broad, almost existential sense, we all live in ignoration all the time. Recognizing this makes knowing what you don’t know feel like a step forward—even an opportunity to be seized…

… In a recent book called “Sense, Nonsense, and Subjectivity,” a German philosopher named Markus Gabriel argues that our personhood is partly based on ignorance—that “to be someone, to be a subject, is to be wrong about something.” It’s intuitive to hold the opposite view—to say that we are the sum of what we know. But Gabriel points out that, even when you know something to be true, you probably also know that there are aspects of it about which you’re probably wrong. I encountered this phenomenon recently when my son asked me to explain the meaning of “E=mc²”—but, also, when I tried to tell him about how I’d met his mom. “We were riding up in an elevator, and we started talking, and then she got off,” I said. “And then, later, when I was riding down, she got back on.”

This story is true, but also wreathed in inevitable uncertainties. What exactly did we say to one another? What were we wearing, or thinking, or feeling, before and after? There are limits to recollection, and to noticing in the moment; life is short, and you can’t know it all, not even about yourself. But you can know, at least to some extent, what you chose not to know and what you wished you’d found out. You can understand what you looked away from, and toward…

“What Don’t We Know?” from @joshuarothman in @NewYorker.

* Socrates, from Plato, Apology 22d

###

As we noodle on nescience, we might send bodacious birthday greetings to that most fabulous of flappers, Betty Boop; she made her first appearance on this date in 1930.  The creation of animator Max Fleischer, she debuted in “Dizzy Dishes” (in which, still unevolved as a character, she is drawn as an anthropomorphic female dog).

Written by (Roughly) Daily

August 9, 2024 at 1:00 am

“A public-opinion poll is no substitute for thought”*…

Opinion polls are a key accelerant in the inflamed civil discourse of our time. And, as Teresa Carr explains, that’s a problem…

Last December, a joint survey by the Economist and the polling organization YouGov claimed to reveal a striking antisemitic streak among America’s youth. One in five young Americans thinks the Holocaust is a myth, according to the poll. And 28 percent think Jews in America have too much power.

“Our new poll makes alarming reading,” declared the Economist. The results inflamed discourse over the Israel-Hamas war on social media and made international news.

There was one problem: The survey was almost certainly wrong. The Economist/YouGov poll was a so-called opt-in poll, in which pollsters often pay people they’ve recruited online to take surveys. According to a recent analysis from the nonprofit Pew Research Center, such polls are plagued by “bogus respondents” who answer questions disingenuously for fun, or to get through the survey as quickly as possible to earn their reward.

In the case of the antisemitism poll, Pew’s analysis suggested that the Economist/YouGov team’s methods had yielded wildly inflated numbers. In a more rigorous poll posing some of the same questions, Pew found that only 3 percent of young Americans agreed with the statement “the Holocaust is a myth.”

These are strange times for survey science. Traditional polling, which relies on responses from a randomly selected group that represents the entire population, remains the gold standard for gauging public opinion, said Stanford political scientist Jon Krosnick. But as it’s become harder to reach people on the phone, response rates have plummeted, and those surveys have grown exponentially more expensive to run. Meanwhile, cheaper, less-accurate online polls have proliferated.

“Unfortunately, the world is seeing much more of the nonscientific methods that are put forth as if they’re scientific,” said Krosnick…

… headlines as outrageous as they are implausible continue to proliferate: 7 percent of American adults think chocolate milk comes from brown cows; 10 percent of college graduates think Judge Judy is on the Supreme Court; and 4 percent of American adults (about 10 million people) drank or gargled bleach to prevent Covid-19. And although YouGov is one of the more respected opt-in pollsters, some of its findings — one third of young millennials aren’t sure the Earth is round, for example — strain credulity.

Amidst a sea of surveys, it’s hard to distinguish solid findings from those that dissolve under scrutiny. And that confusion, some experts say, reflects deep-seated problems with new methods in the field — developed in response to a modern era in which a representative sample of the public no longer picks up the phone.

The fractious evolution in polling science is likely to receive fresh attention as the 2024 elections heat up, not least because the consequences of failed or misleading surveys can go well beyond social science. Such “survey clickbait” erodes society’s self-esteem, said Duke University political scientist Sunshine Hillygus: It “undermines people’s trust that the American public is capable of self-governance.”

Veteran pollster Gary Langer compares traditional randomized polling methods, known as probability polling, to dipping a ladle into a well-stirred pot of minestrone soup. “We can look in and see some cannellini beans, little escarole, chunks of tomato,” he said. “We get a good representation of what’s in the soup.”

It doesn’t matter if the pot is the size of Yankee Stadium, he said. If the contents are thoroughly mixed, one ladle is enough to determine what’s in it. That’s why probability surveys of 1,000 people can, in theory, represent what the entire country thinks.
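Langer’s ladle has a quantitative counterpart: for a true probability sample, the sampling error depends on the size of the sample, not the size of the pot. A back-of-the-envelope sketch (my illustration, not from the article):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of n respondents (population size is
    irrelevant as long as the 'soup' is well stirred)."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person probability sample pins down a proportion to within
# roughly +/- 3 percentage points, whether the pot holds a town or a country.
print(f"{margin_of_error(1000):.3f}")  # prints 0.031
```

Quadrupling the sample only halves the margin, which is why pollsters settle around 1,000 respondents. Note that this formula covers sampling noise only; it offers no protection against the nonresponse and bogus-respondent biases the article describes.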

The problem is that getting a truly representative sample is virtually impossible, said YouGov’s Douglas Rivers, who pointed out that these days a good response rate to a randomized poll is 2 percent…

… with the appropriate guardrails against fraud, YouGov chief scientist Rivers said, such methods offer a practical alternative to conventional probability sampling, where the costs are too high, and the response rates are too low. In some sense, he suggested, most polling is now nonprobability polling: When only 2 out of 100 people respond to a survey, it’s much harder to claim that those views are representative, said Rivers. “Sprinkling a little bit of randomness at the initial stage does not make it a probability sample.”

“Our approach has been: Let us assemble a sample systematically based on characteristics,” said Rivers. “It’s not comparable to what the census does in the current population survey, but it’s performed very well in election polling.” Rivers pointed to YouGov’s high ranking on the website FiveThirtyEight, which rates polling firms based on their track record in predicting election results and willingness to show their methods.

Gary Langer was not particularly impressed by high marks from FiveThirtyEight. (His own firm, Langer Research Associates, also gets a top grade for political polling they conduct on behalf of the partnership between ABC News and The Washington Post.) “Pre-election polls, while they get so much attention, are the flea on the elephant of the enterprise of public opinion research,” he said. The vast majority of surveys are concerned with other topics. They form the basis of federal data on jobs and housing, for example, and can reflect the public’s views on education, climate change, and other issues. “Survey data,” he said, “surrounds us, informs our lives, informs the choices we make.”

Given the stakes, Langer relies exclusively on probability polling. Research shows that opt-in polls just don’t produce the same kind of consistent, verifiable results, said Langer…

Research suggests that widely used nonprobability methods, in particular online opt-in polls such as the Economist/YouGov survey, have inherent vulnerabilities.

The prospect of cash or rewards can incentivize some people to complete surveys quickly and with as little effort as possible. “They’re giving you data and answers that just can’t possibly be true,” said Pew Research Center’s Courtney Kennedy.

For example, in one test of opt-in polling, 12 percent of U.S. adults younger than 30 claimed that they were licensed to operate a nuclear submarine. The true figure, of course, is approximately 0 percent…

… Media consumers should be skeptical of implausible findings, said Krosnick. So should reporters, said Langer, who spent three decades as a journalist, and who said news outlets have a responsibility to vet the polls they report on: “Every newsroom in the country — in the world — should have someone on their team evaluate surveys and survey methodologies.”

In the end, people need to realize that survey research involves some degree of uncertainty, said Joshua Clinton, a political scientist at Vanderbilt University, who noted that polls leading up to the 2024 election are bound to get something wrong. “My concern is what that means about the larger inferences that people make about not only polling, but also science in general,” he said. People may just dismiss results as a predictable scientific failure: “‘Oh, the egghead screwed up again.’” Clinton said he wants people to recognize the difficulty of doing social science research, rather than to delegitimize the field outright.

Even Rivers, whose firm produced the Economist poll that made headlines, acknowledged that readers should be cautious with eye-catching headlines. “We’re in a challenging environment for conducting surveys,” he said. That means that people need to take survey results — especially those that are provocative — with a grain of salt.

“The tendency is to overreport polls,” said Rivers. “The polls that get reported are the ones that are outliers.”…

It’s very difficult to get anyone to answer a phone call—and that’s skewing data on everything from chocolate milk to antisemitism: “We’re in a New Era of Survey Science,” from @TeresaRCarr in @undark via @Slate. Eminently worth reading in full.

* Warren Buffett

###

As we take it with a grain of salt, we might recall that it was on this date in 1941 that Tom and Jerry first appeared on screen with those names in the MGM cartoon “The Midnight Snack,” though it was in fact their second screen appearance.

In 1940, MGM had produced “Puss Gets the Boot,” based on Hanna and Barbera’s pitch for a story rooted in two “equal characters who were always in conflict with each other.”  It was the first collaboration between William Hanna and Joseph Barbera (founding a partnership that would last over 50 years and yield such treasures as The Flintstones, Huckleberry Hound, The Jetsons, Scooby-Doo, Top Cat, and Yogi Bear); at over nine minutes in length, it’s the longest T&J ever produced– and the first of three T&J essays (with “Puss n’ Toots” and “Puss ‘n’ Boats”) to pun its title on the fairy tale “Puss in Boots.”  “Puss Gets the Boot” was nominated for an Academy Award– the first of Hanna and Barbera’s many Oscar nominations.

The cat in “Puss Gets the Boot” was actually named “Jasper”; the mouse, “Jinx.”  But when the pilot got the go-ahead to become a series, animator John Carr won a studio-wide naming contest with his suggestion: “Tom and Jerry.”  The cat’s owner, “Mammy Two-Shoes,” was voiced by June Foray— who later earned immortality as the voice of Rocky J. Squirrel.

“A horse! a horse! my kingdom for a horse!”*…

The horse transformed human history—and now, as Christina Larson reports, scientists have a clearer idea of when humans began to transform the horse…

Around 4,200 years ago, one particular lineage of horse quickly became dominant across Eurasia, suggesting that’s when humans started to spread domesticated horses around the world, according to research published [recently] in the journal Nature.

There was something special about this horse: It had a genetic mutation that changed the shape of its back, likely making it easier to ride.

“In the past, you had many different lineages of horses,” said Pablo Librado, an evolutionary biologist at the Spanish National Research Council in Barcelona and co-author of the new study. That genetic diversity was evident in ancient DNA samples the researchers analyzed from archaeological sites across Eurasia dating back to 50,000 years ago.

But their analysis of 475 ancient horse genomes showed a notable change around 4,200 years ago.

That’s when a specific lineage that first arose in what’s known as the Pontic-Caspian Steppe, a plains region that stretches from what is now northeastern Bulgaria across Ukraine and through southern Russia, began to pop up all across Eurasia and quickly replaced other lineages. Within three hundred years, the horses in Spain were similar to those in Russia.

“We saw this genetic type spreading almost everywhere in Eurasia—clearly this horse type that was local became global very fast,” said co-author Ludovic Orlando, a molecular archaeologist at the Centre for Anthropobiology and Genomics of Toulouse in France.

The researchers believe that this change was because a Bronze Age people called the Sintashta had domesticated their local horse and begun to use these animals to help them dramatically expand their territory.

Domesticating wild horses on the plains of Eurasia was a process, not a single event, scientists say.

Archaeologists have previously found evidence of people consuming horse milk in dental remains dating to around 5,500 years ago, and the earliest evidence of horse ridership dates to around 5,000 years ago. But it was the Sintashta who spread the particular horses they had domesticated across Eurasia, the new study suggests…

People had domesticated other animals several thousand years before horses—including dogs, pigs, cattle, goats and sheep. But the new research shows that the shrinking genetic diversity associated with domestication happened much faster in horses.

“Humans changed the horse genome stunningly quickly, perhaps because we already had experience dealing with animals,” said Laurent Frantz, who studies the genetics of ancient creatures at the Ludwig Maximilian University of Munich and was not involved in the study.

“It shows the special place of horses in human societies.”…

“Scientists have traced the origin of the modern horse to a lineage that emerged 4,200 years ago,” from @larsonchristina in @physorg_com.

* Shakespeare, Richard III

###

As we mount up, we might recall that it was on this date in 1878 that Eadweard Muybridge took a series of photographs to prove that all four feet of a horse leave the ground when it runs. He had been retained by former California Governor (and university founder) Leland Stanford to help settle a bet. While Muybridge was best known in his own day for his large photographs of Yosemite Valley, he did seminal early work on motion picture projection, and the approaches he developed for the study of motion are at the heart of both animation and computer analysis today.

source

“This is not your average, everyday darkness. This is… ADVANCED darkness.”*…

As Rob Beschizza explains, Pere Rosselló, an astrophysics student at Universidad de La Laguna in Tenerife, Spain, has created an animation depicting the gravitational collapse of Spongebob

Beschizza muses…

Just imagine being part of a civilization on the cusp of attaining a decent model of the universe’s origins—somewhere between Halley and Lemaître. You start plotting backwards from where we are to where the Big Bang should be, and you find Spongebob instead. Running the numbers again and again. Such a universe has no need of Lovecraft; cosmic horror would be right there in the maths.

Rosselló [also] solved a three-body problem: the one of animating three bodies to look really cool…
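The core of an animation like Rosselló’s is a direct-summation N-body integration: treat sampled points of the image as point masses, sum pairwise Newtonian attractions (with a small softening term so close encounters stay finite), and step positions forward in time. A toy sketch of that core in Python/NumPy; the sampling, parameters, and integrator here are my illustrative assumptions, not Rosselló’s actual code:

```python
import numpy as np

def step(pos, vel, mass, dt=0.01, G=1.0, eps=0.05):
    """One symplectic-Euler step of a direct-summation N-body simulation.
    eps is a softening length that keeps close encounters finite."""
    # pairwise separations: diff[i, j] = pos[j] - pos[i]
    diff = pos[None, :, :] - pos[:, None, :]
    dist3 = (np.sum(diff**2, axis=-1) + eps**2) ** 1.5
    # acc[i] = G * sum_j m_j * (pos_j - pos_i) / |r_ij|^3  (softened)
    acc = G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

# Start from points scattered over a square "sprite" and let it collapse.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(200, 2))   # stand-in for the image's pixels
vel = np.zeros_like(pos)
mass = np.full(200, 1.0 / 200)
for _ in range(100):
    pos, vel = step(pos, vel, mass)
```

Rendering `pos` at each step with a plotting library yields the collapse animation; real simulations swap in higher-order integrators or tree methods such as Barnes–Hut to scale past a few thousand particles.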

“N-body simulation of the gravitational collapse of Spongebob Squarepants,” by @PeRossello via @Beschizza in @BoingBoing.

* SpongeBob, “Rock Bottom”

###

As we deconstruct deconstruction, we might recall that it was on this date (in an unspecified year) that SpongeBob met the green seahorse Mystery.

from the full episode “My Pretty Seahorse”

“Clothes make the man. Naked people have little or no influence on society.”*…

Chengdu loom model (digital reconstruction). Photo courtesy China National Silk Museum, Hangzhou, Zhejiang province

On the defining characteristic of civilization: Peter Frankopan, Marie-Louise Nosch, and Feng Zhao on the history of textiles, with special emphasis on silk…

Some say that history begins with writing; we say that history begins with clothing. In the beginning, there was clothing made from skins that early humans removed from animals, processed, and then tailored to fit the human body; this technique is still used in the Arctic. Next came textiles. The first weavers would weave textiles in the shape of animal hides or raise the nap of the fabric’s surface to mimic the appearance of fur, making the fabric warmer and more comfortable.

The shift from skin clothing to textiles is recorded in our earliest literature, such as in the Babylonian Epic of Gilgamesh, where Enkidu, a wild man living on the Mesopotamian steppe, is transformed into a civilised being by the priestess Shamhat through sex, food and clothing. Judaism, Christianity and Islam all begin their accounts of their origins with a dressing scene. A naked Adam and Eve, eating from the forbidden tree, must flee the Garden of Eden. They clothe themselves and undertake a new way of life based on agriculture and animal husbandry. The earliest textile imprints in clay are some 30,000 years old, much older than agriculture, pottery or metallurgy…

… The technology behind silk had long been a historical puzzle. The recent archaeological discovery of a 2nd-century BCE Han dynasty burial chamber of a woman in Chengdu has now solved it. Her grave contained a miniature weaving workshop with wooden models of doll-sized weavers operating pattern looms with an integrated multi-shaft mechanism and a treadle and pedal to power the loom [see illustration above]. Europeans wouldn’t devise the treadle loom, which enhances power, precision and efficiency, for another millennium.

This technology, known as weft-faced compound tabby, also emerged in the border city of Dura-Europos in Syria and in Masada in Israel, dating to the 70s CE. We can, however, be confident that the technique known as taqueté was first woven with wool fibre in the Levant. From there, it spread east, and the Persians and others turned it into a weft-faced compound twill called samite. Samites became the most expensive and prestigious commodity on the western Silk Roads right up until the Arab conquests. They were highly valued international commodities, traded all the way to Scandinavia….

… The word ‘text’ comes from Latin texere (‘to weave’), and a text – morphologically and etymologically – indicates a woven entity. We can therefore say that history starts not with writing but with clothing. Before history, there was nudity, at least in the Abrahamic tradition; clothing thus marks the beginning of history and society. The representation of nudity as part of a wild and pre-civilised life mirrors the European colonial perspective of the naked human as ‘wild.’

Across the world today, there are two main ways to dress: gendered into male and female, and stylistically into clothing tailored to fit the body, or draped/wrapped around it like the Roman toga or the Indian sari. Fitted clothing dominates globally, especially after the Second World War, with blue jeans and T-shirts now ubiquitous across all continents.

Today, a T-shirt on sale in any shop around the world is the result of a finely meshed web of global collaboration, trade and politics. From cotton fields in Texas or Turkmenistan, to spinning mills in China, garment factories in Southeast Asia, printers in the West, and second-hand clothing markets in Africa, a T-shirt travels thousands of kilometres around the world in its lifetime. On average, a Swede purchases nine T-shirts annually, and even if they are made to last 25 to 30 washes, consumers tend to discard them before then. Greenpeace found that Europeans and North Americans, on average, hold on to their clothes for only three years. Some garments last only for one season, either because they fall out of fashion, or because the quality of the fabric, tailoring and stitching is so poor that the clothes simply fall apart.

This is the impact of fast fashion that has taken hold since the beginning of the 21st century: for millennia, clothing had always been expensive, worth repairing and maintaining, and made to last. Along with the acceleration of consumption came falling prices and an ever-narrowing margin for profit. The fast-fashion business model requires seamless global trade, inexpensive long-distance transportation, cheap flexible labour and plentiful natural resources. That equation is changing in a world that is warming and where trade barriers are coming up. The future of fabrics, textiles and clothing is bound up in the great themes of the present – and the future…

Eminently worth reading in full: “A silken web,” from @peterfrankopan and @NoschMarie in @aeonmag.

For more, see the full UNESCO report containing the chapter (“The World Wide Web”) from which this was adapted: Textiles and Clothing Along the Silk Roads (2022)

* Mark Twain

###

As we dress, we might recall that it was on this date that American audiences first encountered a heroine whose costumes went from regal to humble, then back to regal: Snow White and the Seven Dwarfs premiered. An animated musical fantasy film produced by Walt Disney Productions and released by RKO Radio Pictures, it was based on the 1812 German fairy tale by the Brothers Grimm (here and here); it was the first full-length cel-animated feature film and the first Disney feature film.

source