(Roughly) Daily

“Horror is a universal language; we’re all afraid”*…

The Exorcist

It’s that time of year again…

Why do so many people voluntarily seek out entertainment that is designed to shock and scare them? What do they get out of it? A thrill, a jolt to the nervous system – or is there something deeper going on?

Horror movies come in various forms, which can be divided into two main subgenres: supernatural ones (think of wailing ghosts, rotting zombies or mind-shattering abominations from forbidden dimensions), and the more psychological (your masked-serial-killers and giant-reptiles varieties). Common to them all is that they aim to evoke negative emotions, such as fear, anxiety, disgust and dread. They also tend to be enormously popular. According to a survey my colleagues and I conducted a few years ago, more than half of US respondents – about 55 per cent – say they enjoy ‘scary media’, including movies such as The Exorcist (1973), books such as Stephen King’s ‘Salem’s Lot (1975) and video games such as Amnesia: The Dark Descent (2010).

What’s more, people who say they enjoy scary media really mean it. We also asked our respondents how frightening they wanted their horror to be. It might sound like a weird thing to ask – like asking how funny they want their comedies to be – but we wanted to test an old Freudian idea that the negative emotions elicited by the genre are unfortunate byproducts; a price that audiences are willing to pay in order to watch movies that allow them to confront their own repressed desires in monstrous disguise. But that’s not what we found. About 80 per cent of our respondents said they wanted their horror entertainment to be in the moderate-to-highly frightening range. By contrast, a measly 3.9 per cent said that they prefer horror that’s not scary at all.

So, fear and the other negative emotions are central to the appeal of horror, a fact not lost on the creators of horror entertainment. Surely you’ve seen movie trailers claiming to be ‘The scariest movie of all time!’ or promising to make you sleep with the lights on for weeks afterwards. More inventively, the US filmmaker William Castle once took out life insurance on his audience. If any audience member died from fear as they watched his movie Macabre (1958), their bereaved ones would receive $1,000 from Lloyd’s of London. (Nobody did die. But the gimmick surely drew more horror hounds to the picture.)

Unsurprisingly, given their appeal, horror movies are big business. In 2019, 40 new horror movies were released in North America, grossing more than $800 million in the domestic theatrical market alone. Likewise, the US haunted attractions industry is growing steadily, in 2019 generating up to $500 million in ticket sales. The following year, 2020, naturally saw lower numbers, but even in that year of COVID-19 lockdowns and empty movie theatres, horror movies broke all previous records in terms of market share. That development continued into 2021, with the horror genre now accounting for almost 20 per cent of the market share at the US box office. Evidently, people want scary entertainment, even when you’d think the real world was scary enough.

Despite the broad appeal of the horror genre, it is haunted by bias and prejudice. Many people, apparently, think that horror movies are dumb, dangerous or both – artistically unsophisticated, morally corrosive, and psychologically harmful, with a dubious appeal primarily for maladjusted teenage boys. But what does the science say?

Firstly, horror is not a particularly male genre. While boys and men are slightly more likely than girls and women to say that they enjoy horror, the difference is much smaller than many people seem to think. In our aforementioned survey, when we asked to what extent respondents agree with the statement ‘I tend to enjoy horror media’, on a scale from 1 to 5, men averaged 3.50, whereas women averaged 3.29.

Secondly, horror movies are not only watched by teenagers. Yes, the movies are often marketed to that audience, and the appetite for horror does seem to peak in late adolescence, but it doesn’t emerge out of the blue the day that kids turn 13, and it doesn’t disappear in older people either. An ongoing research project of ours is finding that the desire to derive pleasure from fear is evident even in toddlers, who universally enjoy mildly scary activities, such as chase play and hide-and-seek. Even old folks seem to enjoy the occasional thrill provided by mildly frightening media such as crime shows. The British crime drama Midsomer Murders (1997-) always seemed to me like light horror for seniors, with its eerie theremin theme tune and the inexplicably abundant, often startlingly grisly murders in the otherwise peaceful fictional Midsomer County.

Thirdly, there is no evidence that horror fans are particularly maladjusted, depraved or unempathetic. When my colleagues and I looked into the personality profile of horror fans, we found that they are about as conscientious, agreeable and emotionally stable as the average person, while also scoring higher than average on openness to experience (meaning that they enjoy intellectual stimulation and adventure). It’s true they do tend to score fairly highly on sensation seeking, which suggests that they tend to be easily bored and on the lookout for excitement. Maladjusted or depraved, though? Nope, no evidence.

If horror movies do not attract the maladjusted and the depraved, do they then create psychotic monsters? One might think so, judging from the moral panics that have surrounded the horror genre throughout its recent history, from Victorian-era concern over ‘penny dreadfuls’ – sensationalist, often spooky or grisly stories sold in cheap (one-penny) instalments – to modern-day media meltdowns over slasher movies…

There is no substantial evidence to support that concern – audiences know that what they are watching is fiction. The psychological effects of violent media are still discussed by scholars and scientists, but the ‘monkey see, monkey do’ model of media psychology has been severely criticised on methodological and empirical grounds, and now seems to have been abandoned by most experts. In fact, one recent study covering the period 1960-2012 in the US found that, as movie violence went up, real-world violence actually went down.

A taste for horror is natural and should not be seen as pathological. Kids who are attracted to monster comics such as Tales from the Crypt (1950-55) and The Walking Dead (2003-19) are perfectly normal, as are teenagers who love slasher movies or adults who enjoy haunted attractions. That taste makes good sense from an evolutionary perspective. People evolved to be curious about danger, and they use stories to learn about the world and themselves. Horror stories specifically allow them to imaginatively simulate worst-case scenarios and teach them about the dark sides of the world, and about the dark spectrum of their own emotional lives…

Horror movies… can function as inoculation against the stresses and terrors of the world. They help us improve our coping skills, and they might function as a kind of enjoyable exposure therapy. There is also some preliminary evidence to suggest that people who suffer from anxiety disorders can find comfort in horror movies, presumably because these movies allow them to experience negative emotions in controlled and controllable doses, practise regulation strategies, and ultimately build resilience.

In addition to those psychological benefits, there might be social benefits of watching horror movies. Consider how scientists of religion have puzzled over the prevalence of painful religious rituals. Why do people fire-walk and pierce themselves with sharp objects in religious contexts? Apparently, one major function is that such psychologically and/or physically painful behaviours strengthen group identity and make group members more altruistic toward each other. You go through a painful experience together, which reinforces group bonds. It’s a similar story for horror entertainment…

Horror movies have gotten a bad rap, but watching them has surprisingly wholesome effects: “Fear not,” from Mathias Clasen (@MathiasClasen), director of the Recreational Fear Lab (@RecFearLab) at Aarhus University in Denmark.

* “Horror is a universal language; we’re all afraid. We’re born afraid, we’re all afraid of things: death, disfigurement, loss of a loved one. Everything that I’m afraid of, you’re afraid of and vice versa. So everybody feels fear and suspense. We were little kids once and so it’s taking that basic human condition and emotion and just f*cking with it and playing with it. You can invent new horrors.” – John Carpenter (in a 2015 interview with Interview Magazine)

###

As we shiver, we might send bracing birthday greetings to Robert Alphonse Picardo; he was born on this date in 1953. An actor probably most widely known for his roles as Dr. Dick Richards on ABC’s China Beach; the Emergency Medical Hologram (EMH), also known as The Doctor, on Star Trek: Voyager; the Cowboy in Innerspace; Coach Cutlip on The Wonder Years (for which he received an Emmy nomination); and Richard Woolsey in the Stargate television franchise.

But Picardo also has a distinguished resume in horror, having starred in The Howling (1981), Legend (1985), Munchies (1987), Bates Motel (1987), 976-EVIL (1988), Gremlins 2: The New Batch (1990), Matinee (1993), Tales from the Crypt (episode: “Till Death Do We Part”), Masters of Horror (episode: “Homecoming”), Sensored (2009), Monsterwolf (2010), Supernatural (episode: “Clap Your Hands If You Believe”), Trail of Blood (2011), Don’t Blink (2014), and Mansion of Blood (2015).

Picardo serves as a Planetary Society Board Member and host of The Planetary Post.

source

“People are trapped in history and history is trapped in them”*…

The late David Graeber (with his co-author David Wengrow) left one last book; William Deresiewicz gives us an early look…

Many years ago, when I was a junior professor at Yale, I cold-called a colleague in the anthropology department for assistance with a project I was working on. I didn’t know anything about the guy; I just selected him because he was young, and therefore, I figured, more likely to agree to talk.

Five minutes into our lunch, I realized that I was in the presence of a genius. Not an extremely intelligent person—a genius. There’s a qualitative difference. The individual across the table seemed to belong to a different order of being from me, like a visitor from a higher dimension. I had never experienced anything like it before. I quickly went from trying to keep up with him, to hanging on for dear life, to simply sitting there in wonder.

That person was David Graeber. In the 20 years after our lunch, he published two books; was let go by Yale despite a stellar record (a move universally attributed to his radical politics); published two more books; got a job at Goldsmiths, University of London; published four more books, including Debt: The First 5,000 Years, a magisterial revisionary history of human society from Sumer to the present; got a job at the London School of Economics; published two more books and co-wrote a third; and established himself not only as among the foremost social thinkers of our time—blazingly original, stunningly wide-ranging, impossibly well read—but also as an organizer and intellectual leader of the activist left on both sides of the Atlantic, credited, among other things, with helping launch the Occupy movement and coin its slogan, “We are the 99 percent.”

On September 2, 2020, at the age of 59, David Graeber died of necrotizing pancreatitis while on vacation in Venice. The news hit me like a blow. How many books have we lost, I thought, that will never get written now? How many insights, how much wisdom, will remain forever unexpressed? The appearance of The Dawn of Everything: A New History of Humanity is thus bittersweet, at once a final, unexpected gift and a reminder of what might have been. In his foreword, Graeber’s co-author, David Wengrow, an archaeologist at University College London, mentions that the two had planned no fewer than three sequels.

And what a gift it is, no less ambitious a project than its subtitle claims. The Dawn of Everything is written against the conventional account of human social history as first developed by Hobbes and Rousseau; elaborated by subsequent thinkers; popularized today by the likes of Jared Diamond, Yuval Noah Harari, and Steven Pinker; and accepted more or less universally. The story goes like this. Once upon a time, human beings lived in small, egalitarian bands of hunter-gatherers (the so-called state of nature). Then came the invention of agriculture, which led to surplus production and thus to population growth as well as private property. Bands swelled to tribes, and increasing scale required increasing organization: stratification, specialization; chiefs, warriors, holy men.

Eventually, cities emerged, and with them, civilization—literacy, philosophy, astronomy; hierarchies of wealth, status, and power; the first kingdoms and empires. Flash forward a few thousand years, and with science, capitalism, and the Industrial Revolution, we witness the creation of the modern bureaucratic state. The story is linear (the stages are followed in order, with no going back), uniform (they are followed the same way everywhere), progressive (the stages are “stages” in the first place, leading from lower to higher, more primitive to more sophisticated), deterministic (development is driven by technology, not human choice), and teleological (the process culminates in us).

It is also, according to Graeber and Wengrow, completely wrong. Drawing on a wealth of recent archaeological discoveries that span the globe, as well as deep reading in often neglected historical sources (their bibliography runs to 63 pages), the two dismantle not only every element of the received account but also the assumptions that it rests on. Yes, we’ve had bands, tribes, cities, and states; agriculture, inequality, and bureaucracy, but what each of these were, how they developed, and how we got from one to the next—all this and more, the authors comprehensively rewrite. More important, they demolish the idea that human beings are passive objects of material forces, moving helplessly along a technological conveyor belt that takes us from the Serengeti to the DMV. We’ve had choices, they show, and we’ve made them. Graeber and Wengrow offer a history of the past 30,000 years that is not only wildly different from anything we’re used to, but also far more interesting: textured, surprising, paradoxical, inspiring…

A brilliant new account upends bedrock assumptions about 30,000 years of change: “Human History Gets a Rewrite,” @WDeresiewicz introduces the newest– and last?– book from @davidgraeber and @davidwengrow. Eminently worth reading in full.

* James Baldwin

###

As we reinterpret, we might spare a thought for Vic Allen; he died on this date in 2014. A British human rights activist, political prisoner, sociologist, historian, economist, and professor at the University of Leeds, he worked closely with British trade unions and was considered a key player in the resistance against Apartheid in South Africa. He spent much of his life supporting the South African National Union of Mineworkers (NUM) and was a key mentor to British trade union leader Arthur Scargill. In 2010, Allen was awarded the Kgao ya Bahale award, the highest honor afforded by the NUM. After his death he was widely commended by his fellow academics and activists for his lifelong commitment to workers’ rights and racial equality.

source

“Rather than heralding a new era of easy living, the Agricultural Revolution left farmers with lives generally more difficult and less satisfying than those of foragers”*…

Lest we doubt that climate change can have fundamental effects…

Why was agriculture invented? The long-run advantages are clear: farming produced food surpluses that allowed population densities to rise, labor to specialize, and cities to be constructed. However, we still don’t know what motivated the transition in the short run. After 200,000 years of hunting and gathering, agriculture was invented independently at least seven times, on different continents, within a 7,000-year period. Archeologists agree that independent inventions occurred at least in the Fertile Crescent, sub-Saharan Africa, North and South China, the Andes, Mexico, and North America. Moreover, the first farmers were shorter and had more joint diseases, suggesting that they ate less than hunter-gatherers and worked more. Why would seven different human populations decide to adopt remarkably similar technologies, around the same time, and in spite of a lower standard of living?

I propose a new theory for the Neolithic Revolution, construct a model capturing its intuition, and test the resulting implications against a panel dataset of climate and adoption. I argue that the invention of agriculture was triggered by a large increase in climatic seasonality, which peaked approximately 12,000 years ago, shortly before the first evidence for agriculture appeared. This increase in seasonality was caused by well-documented oscillations in the tilt of Earth’s rotational axis and other orbital parameters. The harsher winters and drier summers made it hard for hunter-gatherers to survive during part of the year. Some of the most affected populations responded by storing foods, which in turn forced them to abandon their nomadic lifestyles, since they had to spend most of the year next to their necessarily stationary granaries, either stocking them or drawing from them. While these communities were still hunter-gatherers, sedentarism and storage made it easier for them to adopt farming…

During the Neolithic Revolution, seven populations independently invented agriculture; a new paper argues that climate change was the cause: “The Ant and the Grasshopper: Seasonality and the Invention of Agriculture,” from Andrea Matranga (@andreamatranga)

[image above: source]

* Yuval Noah Harari, Sapiens: A Brief History of Humankind

###

As we reap what we sow, we might spare a thought for Clarence Birdseye; he died on this date in 1956. An inventor, entrepreneur, and naturalist, he was the founder of the modern frozen food industry.

On Arctic trips as a field naturalist for the United States government, he noticed that freshly caught fish, when placed onto the Arctic ice and exposed to the icy wind and frigid temperatures, froze solid almost immediately. He learned, too, that the fish, when thawed and eaten, still had all its fresh characteristics. He concluded that quickly freezing certain items kept large crystals from forming, preventing damage to their cellular structure. In 1922, Clarence organized his own company, Birdseye Seafoods, Inc., in New York City, where he began processing chilled fish fillets. He moved on to vegetables and other meats, then to the “fish stick,” along the way co-founding General Foods. In the end, Birdseye held over 300 patents for creating and handling frozen food.

Clarence Birdseye

 source

“An architect should live as little in cities as a painter. Send him to our hills, and let him study there what nature understands by a buttress, and what by a dome.”*…

We’ve misunderstood an important part of the history of urbanism– jungle cities. Patrick Roberts suggests that they have much to teach us…

Visions of “lost cities” in the jungle have consumed western imaginations since Europeans first visited the tropics of Asia, Africa and the Americas. From the Lost City of Z to El Dorado, a thirst for finding ancient civilisations and their treasures in perilous tropical forest settings has driven innumerable ill-fated expeditions. This obsession has seeped into western societies’ popular ideas of tropical forest cities, with overgrown ruins acting as the backdrop for fear, discovery and life-threatening challenges in countless films, novels and video games.

Throughout these depictions runs the idea that all ancient cities and states in tropical forests were doomed to fail. That the most resilient occupants of tropical forests are small villages of poison dart-blowing hunter-gatherers. And that vicious vines and towering trees – or, in the case of The Jungle Book, a boisterous army of monkeys – will inevitably claw any significant human achievement back into the suffocating green whence it came. This idea has been boosted by books and films that focus on the collapse of particularly enigmatic societies such as the Classic Maya. The decaying stone walls, the empty grand structures and the deserted streets of these tropical urban leftovers act as a tragic warning that our own way of life is not as secure as we would like to assume.

For a long time, western scholars took a similar view of the potential of tropical forests to sustain ancient cities. On the one hand, intensive agriculture, seen as necessary to fuel the growth of cities and powerful social elites, has been considered impossible on the wet, acidic, nutrient-poor soils of tropical forests. On the other, where the rubble of cities cannot be denied, in the drier tropics of North and Central America, south Asia and south-east Asia, ecological catastrophe has been seen as inevitable. Deforestation to make way for massive buildings and growing populations, an expansion of agriculture across marginal soils, as well as natural disasters such as mudslides, flooding and drought, must have made tropical cities a big challenge at best, and a fool’s gambit at worst.

Overhauling these stereotypes has been difficult. For one thing, the kind of large, multiyear field explorations usually undertaken on the sites of ancient cities are especially hard in tropical forests. Dense vegetation, mosquito-borne disease, poisonous plants and animals and torrential rain have made it arduous to find and excavate past urban centres. Where organic materials, rather than stone, might have been used as a construction material, the task becomes even more taxing. As a result, research into past tropical urbanism has lagged behind similar research in Mesopotamia and Egypt and the sweeping river valleys of east Asia.

Yet many tropical forest societies found immensely successful methods of food production, in even the most challenging of circumstances, which could sustain impressively large populations and social structures. The past two decades of archaeological exploration, applying the latest science from the land and the air, have stripped away canopies to provide new, more favourable assessments.

Not only did societies such as the Classic Maya and the Khmer empire of Cambodia flourish, but pre-colonial tropical cities were actually some of the most extensive urban landscapes anywhere in the pre-industrial world – far outstripping ancient Rome, Constantinople/Istanbul and the ancient cities of China.

Ancient tropical cities could be remarkably resilient, sometimes surviving many centuries longer than colonial- and industrial-period urban networks in similar environments. Although they could face immense obstacles, and often had to reinvent themselves to beat changing climates and their own exploitation of the surrounding landscape, they also developed completely new forms of what a city could be, and perhaps should be.

Extensive, interspersed with nature and combining food production with social and political function, these ancient cities are now catching the eyes of 21st-century urban planners trying to come to grips with tropical forests as sites of some of the fastest-growing human populations around the world today…

They may be vine-smothered ruins today, but the lost cities of the ancient tropics still have a lot to teach us about how to live alongside nature. Dr. Roberts (@palaeotropics) explains: “The real urban jungle: how ancient societies reimagined what cities could be,” adapted from his new book, Jungle: How Tropical Forests Shaped the World – and Us.

* John Ruskin

###

As we acclimate, we might send thoughtful birthday greetings to Sir Karl Raimund Popper; he was born on this date in 1902. One of the greatest philosophers of science of the 20th century, Popper is best known for his rejection of the classical inductivist views on the scientific method in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments. (Or, more simply put: where classical induction seeks to prove theories true by piling up confirming instances, Popper reversed the logic: no number of confirmations can prove a theory, but a single decisive counter-instance can disprove it, so theories stand only provisionally, until falsified.)

Popper was also a powerful critic of historicism in political thought, and (in books like The Open Society and Its Enemies and The Poverty of Historicism) an enemy of authoritarianism and totalitarianism (in which role he was a mentor to George Soros).

 source

“Human decision-making is complex. On our own, our tendency to yield to short-term temptations, and even to addictions, may be too strong for our rational, long-term planning.”*…

Many of us acknowledge that long-term thinking is a difficult, but necessary investment in a safe and happy future– our obligation to those who come after us. But it turns out that long-term thinking has more immediate benefits as well…

In times of global crisis, focusing on the present is justified. Yet as we move into 2021, there is good reason to spend some time also reflecting on our place within the longer-term past and future. For one, there remain creeping problems that we cannot ignore, such as climate change, antibiotic resistance or biodiversity loss. But also because contemplating deeper time can help replenish our mental energies during adversity, and offer a meditative source of catharsis amid the frenzy of the now.

In my research and writing, I explore the worldviews of nuclear waste experts in Finland, who reckon with radioactive isotopes over extremely long-term planetary timeframes. Plutonium-239 has a half-life of 24,100 years, whereas uranium-235’s half-life is over seven hundred million years. Like many anthropologists doing fieldwork within other cultures, my mission has been to uncover insights that could widen people’s perspectives in my own or other societies.
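(A quick back-of-the-envelope illustration of what those figures imply – this is just the standard exponential-decay formula, not anything specific to the Finnish project: a radioactive sample diminishes as

N(t) = N₀ · (1/2)^(t / T½)

so after 100,000 years, many times the span of recorded history, roughly (1/2)^(100,000 / 24,100) ≈ 5.6 per cent of the original plutonium-239 would still remain, while uranium-235, with its half-life of some 704 million years, would have barely begun to decay.)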

While the experiences of a nuclear waste expert may seem an unusual source of inspiration for well-being, this research has taught me that there can be personal benefits to stretching the intellect across time. Here’s how you might integrate some of these principles into your own life as you step into next year.

Doing Safety Case-inspired deep time exercises can not only help us imagine local landscapes over decades, centuries, and millennia. It can also help us take a step back from our everyday lives – transporting our minds to different places and times, and feeling rejuvenated when we return.

There are several benefits to this. Cognitive scientists have shown how creativity can be sparked by perceiving “something one has not seen before (but that was probably always there).” Corporate coaches have recommended taking breaks from our familiar thinking patterns to experience the world in new ways and overcome mental blocks. Contemplating deep time can cultivate a thoughtful appreciation of our species’ and planet’s longer-term histories and futures.

Yet it can also help us refresh during frazzled moments of unrest. Setting aside a few minutes each day for deep time contemplation can enrich us by evoking a momentary sense of awe. A Stanford University study has shown how awe can expand our sense of time and promote well-being. Anthropologist Barbara King has shown how awe can be “mind- and heart-expanding.”

Our challenge, then, is to discover, in ourselves, techniques for always bringing an awe-inspired awareness of deep time with us – wherever our futures may lead.

Taking inspiration from a far-sighted Finnish nuclear waste project, anthropologist Vincent Ialenti (@vincent_ialenti) explains why embracing Earth’s radical long term can be good for well-being today: the benefits of embracing ‘deep time’ in a year like this.

* Peter Singer

###

As we find perspective and peace in being good ancestors, we might say alles Gute zum Geburtstag to Georg Christoph Lichtenberg; he was born on this date in 1742. Lichtenberg held the first professorship in Germany explicitly dedicated to experimental physics; he is remembered for his posthumously published notebooks, which he himself called Sudelbücher, a description modeled on the English bookkeeping term “waste books” or “scrapbooks”, and for his discovery of the tree-like electrical discharge patterns now called Lichtenberg figures.

One of the first scientists to incorporate experiments with apparatus into his lectures, Lichtenberg was a popular and respected figure in contemporary European intellectual circles. He was one of the first to introduce Benjamin Franklin’s lightning rod to Germany, installing such devices in his house and garden sheds. He maintained relations with most of the great figures of that era, including Goethe and Kant, and was sought out by other leading scientists: Alessandro Volta visited Göttingen especially to see him and his experiments; the mathematician Carl Friedrich Gauss sat in on his lectures.

But Lichtenberg was also an accomplished satirist, whose works put him in the highest ranks of German writers of the 18th century. And he proposed the standardized paper-size system used globally today (except in Canada and the U.S.) and now defined by ISO 216, which has A4 as its most commonly used size.
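(The arithmetic behind that system is worth a moment – a sketch of the ISO 216 design, not a claim about Lichtenberg’s original letter: every sheet in the A series has the aspect ratio √2 : 1, the unique proportion preserved by halving. If a sheet’s sides are a and b with a/b = √2, cutting across the long side yields two sheets of sides b and a/2, and

b / (a/2) = 2b/a = 2/√2 = √2

so the shape repeats at every size. Anchoring A0 at exactly one square meter of area then fixes the whole series; four halvings down, A4 has an area of 1/16 m² and the familiar dimensions of 210 × 297 mm, with 297/210 ≈ 1.414 ≈ √2.)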

Perhaps in time, the so-called Dark Ages will be thought of as including our own…

Georg Christoph Lichtenberg

source
