From the piece featured below: “GDP per capita in Madagascar is about the same today as it was in 1950. As a consequence, the number of people in extreme poverty increased in line with the country’s population growth”
It’s easy to feel hopeful about the advances the world has made in eradicating extreme poverty over the last several decades. But as Max Roser writes, unless the poorest economies start growing, this period of progress against the worst form of poverty is over…
In the last decades, the world has made fantastic progress against extreme poverty. In 1990, 2.3 billion people lived in extreme poverty. Since then, the number of extremely poor people has declined by 1.5 billion people.
This means on any average day in the last 35 years, about 115,000 people left extreme poverty behind.1 Leaving the very worst poverty behind doesn’t mean a life free of want, but it does mean a big change. Additional income matters most for those who have the least. It means having the chance to leave hunger behind, to gain access to clean water, to access better healthcare, and to have at least some electricity — for light at night and perhaps even to cook and heat.
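The daily figure quoted above can be sanity-checked with simple back-of-the-envelope arithmetic (a sketch assuming roughly 1.5 billion people over roughly 35 years; the author’s footnoted calculation may differ slightly):

```python
# Rough check of "about 115,000 people per day" (illustrative, not the
# author's exact method): 1.5 billion people left extreme poverty over
# about 35 years.
people = 1_500_000_000
years = 35
per_day = people / (years * 365)
print(round(per_day))  # prints 117417, consistent with "about 115,000"
```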
Can we expect this rapid progress to continue?
Unfortunately, we cannot. Based on current trends, progress against extreme poverty will come to a halt. As we’ll see, the number of people in extreme poverty is projected to decline, from 831 million people in 2025 to 793 million people in 2030. After 2030, the number of extremely poor people is expected to increase.
To understand why the rapid progress against deep poverty will not continue into the future, we need to know why the world made progress in the past.
Extreme poverty declined in the last three decades because, back in the 1990s, the majority of the poorest people on the planet lived in countries that subsequently achieved very fast economic growth. In Indonesia and China, more than two-thirds of the population lived in extreme poverty. But these economies then grew rapidly, so that by today, the share has declined to less than 10%. Other large Asian countries — including India, Pakistan, Bangladesh, and the Philippines — also achieved strong growth, and as a consequence, the share living in extreme poverty declined rapidly. Much of the progress happened in Asia, but conditions in other regions improved too: the share living in extreme poverty also declined in Ghana, Cape Verde, Cameroon, Panama, Bolivia, Mexico, Brazil, and many other countries.
This chart shows the economic change in these countries over the past decades. As incomes increased, the share of people in extreme poverty declined.
Share of population living in extreme poverty vs. GDP per capita, 1990 to 2024 (World Bank, Eurostat, OECD, IMF)
What is different today is that the majority of the world’s poorest people are stuck in economies that have been stagnating for a long time. Consider the case of Madagascar. In the long run, the country has not seen any growth at all: GDP per capita in Madagascar is about the same today as it was in 1950. As a consequence, the number of people in extreme poverty increased in line with the country’s population growth. In richer countries, it is possible to reduce poverty by reducing inequality through redistribution, but a country like Madagascar cannot reduce its share of people in extreme poverty through redistribution. This is because the mean income is lower than the poverty line; if everyone had the same income, everyone would be living in extreme poverty.
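The redistribution point can be made concrete with a tiny numerical sketch (the poverty line and incomes below are hypothetical illustration, not Madagascar’s actual data):

```python
# Illustrative only (hypothetical numbers): when mean income is below
# the poverty line, even perfect redistribution leaves everyone poor.
poverty_line = 2.15                    # hypothetical daily poverty line
incomes = [0.8, 1.2, 1.5, 2.0, 4.0]    # hypothetical daily incomes

mean_income = sum(incomes) / len(incomes)        # below the line
equalized = [mean_income] * len(incomes)         # perfect redistribution
share_poor = sum(x < poverty_line for x in equalized) / len(equalized)

print(round(mean_income, 2), share_poor)  # prints: 1.9 1.0
```

With a mean income below the line, equalizing incomes pushes the share in poverty to 100%, which is why growth, not redistribution alone, is the binding constraint in the poorest economies.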
The situation is similar in other countries, as the chart below shows: in the Democratic Republic of Congo, Mozambique, Malawi, Burundi, and the Central African Republic, more than half of the population lives in extreme poverty. As their economies have stagnated, the deep poverty that most people live in has remained largely unchanged for decades.
This is why we have to expect the end of progress against extreme poverty based on current trends. If the poorest economies remain stagnant, hundreds of millions of people will continue to live in extreme poverty.
Share of population living in extreme poverty, 1992-2022 (World Bank)
I’m always skeptical when people say that we are at a juncture in history where the future looks much different than the past. But when it comes to the fight against extreme poverty, I fear it is true. Today, the majority of the world’s poorest people are living in economies that have not achieved economic growth in the recent past… Based on current trends, we have to expect the end of progress against extreme poverty…
… It’s no news that we should expect an end to progress against extreme poverty. This article is an update of an article I published in 2019, in which I wrote the same: the fact that the poorest economies are not growing means that the rapid progress against extreme poverty seen in the last decades will end.
Although this prospect has been known for years, it has hardly received the attention it deserves. Progress against extreme poverty was one of humanity’s most outstanding achievements of the past decades — the end of it would be one of the very worst realities of the coming ones.
Importantly, however, these projections are not predictions; their purpose is not to describe what the world in 2030 or 2040 will certainly look like. These projections describe what we have to expect based on current trends; they tell us about our present world rather than the reality of tomorrow. Current trends don’t have to become future facts: many countries left extreme poverty behind in the past, because they had a moment at which they broke out of stagnation.
What these projections tell us, however, is that if the poorest countries do not start to grow, a very bleak future is ahead of us: a future in which extreme poverty remains the reality for hundreds of millions for many years to come…
As we put our shoulders to the wheel, we might spare a thought for a man who contributed mightily to our capacity to feed humanity, Kenneth V. Thimann; he died on this date in 1997. A microbiologist, he was a pioneer in plant physiology (especially the hormones that control the development of plants). Building on the thinking of Frits Went, he identified the first plant hormone to be discovered– the first auxin, a class of growth hormones, and revealed its chemical structure– which proved very important to agriculture and its yields.
Tomorrow is, of course, Thanksgiving Day in the United States… and for many, an occasion to take “the cousin walk.” (R)D will be off for the day, returning (no doubt with a tryptophan hangover) on Friday.
Meantime, Alicia Kennedy on what’s become of “the foodie” and what it would mean to take taste seriously again…
The foodie is in crisis. For forty years, the word itself has been hanging out in the culture, signifying a person who doesn’t just eat but knows what farm the arugula came from and which chef in town has the hottest pedigree. Where once the foodie had Anthony Bourdain roving the world in a leather jacket, telling them how to travel, what to eat, and how to be in restaurants, his death in 2018 left a hole that seemingly nothing in today’s food culture can fill. How does food emerge from its post-Bourdain malaise? Not even Stanley Tucci searching for Italy could resuscitate the culture into a consensus about who the foodie is now and what they care about.
Perhaps the foodie has become imperiled by the transformation of so many of our meals, snacks, and grocery hauls into mere fodder for social media. Preparing, serving, and eating food is now too often only a prelude to posting: the dimly lit dinner party featuring a mountain of whipped butter beside sourdough bread, the Saturday breakfast with an espresso cup placed just so upon the salmon newsprint of the Financial Times, a sun-drenched spread of shellfish on a trip to Lisbon—all in service to the almighty god of content. Being a foodie is no longer about experience and knowledge. Documentation is in; expertise is out, even if we can all cite Bourdain explaining that Sichuan food with Coke is the best way to cure a hangover.
The problem isn’t just about the domination of food culture by internet aesthetics. Instead, it’s about the way food enthusiasts use those aesthetics to curate away complexity and discomfort, leaving food systems unchallenged and food culture shallow. If all you want is a nice meal on the table, you don’t have to think about the overworked and underpaid farmworkers who made it possible. If you want pop history or recipes, you can gorge on them. This may all be perfectly pleasant. But what’s been lost in the process is the foodie’s potential power as both tastemaker and advocate…
[Kennedy considers two recent books: All Consuming: Why We Eat the Way We Eat Now, by Ruby Tandoh (former star of The Great British Bake Off) and Marion Nestle’s (author of Food Politics and originator of New York University’s Food Studies program) newly updated version of her 2006 classic, What to Eat Now. “Taken together, these books model what we’ve lost and point toward reclaiming it.” She then considers the late 20th century cultural history of food and foodies…]
… there’s a fundamental tension at the heart of foodie culture: everyone must eat, making food more universal than music or theater—yet class inequities shape how we do it, turning appetite into a marker of status. This is precisely why the term matters. Unlike other cultural identities, the foodie sits at the intersection of necessity and privilege, with the potential to bridge this divide—or to further entrench it.
Books like Tandoh’s and Nestle’s point toward closing that divide. They recognize that food can’t be detangled from industry and profit—that’s how it reaches our tables—but insist we look at the whole system. Behind the perfect peaches on social media feeds puppeteered by corporate algorithms are exploited farmworkers passing out from heatstroke. Behind every foodie is someone who just needs to eat, especially now that the federal government is fighting about SNAP. The question is whether those realities can coexist in our consciousness, or whether our fractured landscape will keep them separate.
For more than forty years, the word foodie has functioned as an inescapable shorthand for “someone who cares about food.” The shape that care takes is the real question. Nestle and Tandoh are arguing for rigorous care but in different ways: these books ask readers to remember the corporate and political power behind every option at the supermarket, and to be conscious of how various kinds of media are selling us certain sorts of gastronomic pleasure. Read in tandem, they ask us to be active participants in our daily meals beyond mere procurement. The first step toward a more conscientious foodie might be reclaiming the idea that our relationship to food exists not solely through recipes and memes but through power structures and systemic inequities that govern how food is grown, sold, and shared. A foodie’s appetite must have room for both pleasure and responsibility.
As we contemplate comestibles, we might recall that this date in 1789 was chosen by George Washington (on October 3rd of that year) as the occasion of the young nation’s first official Thanksgiving.
Thanksgiving was first celebrated as a regular national holiday on the final Thursday in November, by proclamation of President Abraham Lincoln, on this date in 1863.
Read the full text of Washington’s proclamation here (and of Lincoln’s here).
… and if we’re not careful, we might not be too pleased with what we get. Sam Altman says the one-person billion-dollar company is coming. Evan Ratliff tells the tale of his attempt to build a completely AI-automated venture…
… If you’ve spent any time consuming any AI news this year—and even if you’ve tried desperately not to—you may have heard that in the industry, 2025 is the “year of the agent.” This year, in other words, is the year when AI systems are evolving from passive chatbots, waiting to field our questions, to active players, out there working on our behalf.
There’s no widely agreed-upon definition of AI agents, but generally you can think of them as versions of large language model chatbots that are given autonomy in the world. They are able to take in information, navigate digital space, and take action. There are elementary agents, like customer service assistants that can independently field, triage, and handle inbound calls, or sales bots that can cycle through email lists and spam the good leads. There are programming agents, the foot soldiers of vibe coding. OpenAI and other companies have launched “agentic browsers” that can buy plane tickets and proactively order groceries for you.
In the year of our agent, 2025, the AI hype flywheel has been spinning up ever more grandiose notions of what agents can be and will do. Not just as AI assistants, but as full-fledged AI employees that will work alongside us, or instead of us. “What jobs are going to be made redundant in a world where I am sat here as a CEO with a thousand AI agents?” asked host Steven Bartlett on a recent episode of The Diary of a CEO podcast. (The answer, according to his esteemed panel: nearly all of them). Dario Amodei of Anthropic famously warned in May that AI (and implicitly, AI agents) could wipe out half of all entry-level white-collar jobs in the next one to five years. Heeding that siren call, corporate giants are embracing the AI agent future right now—like Ford’s partnership with an AI sales and service agent named “Jerry,” or Goldman Sachs “hiring” its AI software engineer, “Devin.” OpenAI’s Sam Altman, meanwhile, talks regularly about a possible billion-dollar company with just one human being involved. San Francisco is awash in startup founders with virtual employees, as nearly half of the companies in the spring class of Y Combinator are building their product around AI agents.
Hearing all this, I started to wonder: Was the AI employee age upon us already? And even, could I be the proprietor of Altman’s one-man unicorn? As it happens, I had some experience with agents, having created a bunch of AI agent voice clones of myself for the first season of my podcast, Shell Game.
I also have an entrepreneurial history, having once been the cofounder and CEO of the media and tech startup Atavist, backed by the likes of Andreessen Horowitz, Peter Thiel’s Founders Fund, and Eric Schmidt’s Innovation Endeavors. The eponymous magazine we created is still thriving today. I wasn’t born to be a startup manager, however, and the tech side kind of fizzled out. But I’m told failure is the greatest teacher. So I figured, why not try again? Except this time, I’d take the AI boosters at their word, forgo pesky human hires, and embrace the all-AI employee future…
Ratliff, the undefeated king of tech journalism stunts, is back with another banger: For this piece and the accompanying podcast series, he created a start-up staffed entirely by so-called AI agents. The agents can communicate by email, Slack, text, and phone, both with Ratliff and among themselves, and they have free rein to complete tasks like writing code and searching the open internet. Despite their capabilities, however, the whole project’s a constant farce. A funny, stupid, telling farce that says quite a lot about the future of work that many technologists envision now…
As we analyze autonomy, we might spare a jaundiced thought for Trofim Denisovich Lysenko; he died on this date in 1976. A Soviet biologist and agronomist, he believed the Mendelian theory of heredity to be wrong, and developed his own, allowing for “soft inheritance”– the heritability of learned behavior. (He believed that in one generation of a hybridized crop, the desired individual could be selected and mated again and continue to produce the same desired product, without worrying about separation/segregation in future breeds– he assumed that after a lifetime of developing (acquiring) the best set of traits to survive, those must be passed down to the next generation.)
In many ways Lysenko’s theories recall Lamarck’s “organic evolution” and its concept of “soft evolution” (the passage of learned traits), though Lysenko denied any connection. He followed I. V. Michurin’s fanciful idea that plants could be forced to adapt to any environmental conditions, for example converting summer wheat to winter wheat by storing the seeds in ice. With Stalin’s support for two decades, he actively obstructed the course of Soviet biology, caused the imprisonment and death of many of the country’s eminent biologists who disagreed with him, and imposed conditions that contributed to the disastrous decline of Soviet agriculture and the famines that resulted.
Interestingly, some current research suggests that heritable learning– or a semblance of it– may in fact be happening by virtue of epigenetics… though nothing vaguely resembling Lysenko’s theory.
Hay, the foundation of the diet of grazing animals, is central to American agriculture. The USDA forecasts 2025 hay production at 123.5 million tons (grown on about 50 million acres countrywide), of which about 3.24 million tons are exported (generating over $1 billion in revenue); the balance is consumed domestically.
Most of that hay is baled for storage and transport (in rectangular or round bales) using special (and increasingly expensive) equipment. But as Katie Hill reports, 115 years ago, before the advent of those motorized balers, a homegrown invention redefined stacking hay in the West. Some ranchers still see no reason to upgrade…
A scan of the horizon in Montana’s Big Hole Valley reveals plenty of examples of the land reclaiming what once belonged to it. Derelict jackleg fence. Log calving sheds with caving roofs. Rusting Chevrolets and spools of barbed wire. A giant compost pile of livestock carcasses, bones protruding from the mulch like seashells at low tide.
Then, every five miles or so, an old, spindly implement punctuates the scenery. It’s tall, maybe 30 feet, resembling a giant see-saw permanently out of balance. It’s not so much a stairway to heaven as it is a halted conveyor belt to nowhere; there’s no grain silo or corn crib nearby for a machine like this to fill up from above. Regardless, its efficacy in stacking giant piles of hay is clear from its construction. Grass grows tall around its base of rough-hewn lodgepoles, as if the earth might swallow it whole if it stayed put for another century.
This contraption [pictured at the top] is known as the beaverslide, patented in 1910 by Big Hole ranchers Herb Armitage and D.J. Stephens. The haystacking device consists of a wide, sliding fork at the base of a ramp and a cable pulley system rigged to the ramp’s underside. In practice, ranchers use a team of horses or a motorized vehicle with a winch to pull one of the cables perpendicular to the beaverslide, which in turn hoists the fork up the ramp, bringing a giant pile of hay up with it. (Ranchers rake cut hay onto the beaverslides with old buck rakes.) At the top of the ramp, the hay falls to the other side, forming three-story piles that can reach 25 tons in weight, depending on who you ask.
Details on the manufacturing and distribution of the beaverslide — named for its origins in Beaverhead County — are slim. The prevailing story is that ranchers often made their own, then made duplicates for neighboring ranches upon request, according to Big Hole rancher lore. Over the last few decades, the contraption has largely become a relic of a bygone era. But it’s not entirely obsolete, as some ranchers still use their old beaverslides today. With modern challenges like ballooning upgrade costs and the ever-present battle over a rancher’s right to repair their own equipment, the analog beaverslide makes more and more sense for those still using one with every passing hay season…
… the Kirkpatricks recall neighbors being stuck in the middle of winter with broken-down bale processors and hungry cows. The closest repair shop in Jackson, an unincorporated community of roughly 20 people, is 42 miles south. The next closest shops or available technicians might be 53 miles away in Butte or 73 miles away in Dillon.
Many big-name mechanized implements run on trademarked chip technology that requires a trip to an authorized dealership for servicing. Even ranchers like Humbert who otherwise possess ample repair knowledge don’t have access to the diagnostic equipment necessary to solve problems on the fly. This might sound like sacrilege for an industry that lives and dies with rural, self-sufficient communities, but a bill calling for a rancher’s right to repair their own equipment died in the 2025 Montana legislature.
* A Tudor expression dating back to the mid-16th century, and used figuratively since 1673
###
As we honor old ways, we might recall that it was on this date in 1974 that Island Records released Country Life, the fourth studio album by Roxy Music.
Blue-stained serpentine Neotyphodium coenophialum mycelia inhabiting the intercellular spaces of tall fescue leaf sheath tissue. Magnified 400x.
Anna Marija Helt reports that, as global warming challenges traditional agriculture, scientists are looking to “probiotics” for crops as a new green revolution in agriculture…
Potatoes contain something about which most people are entirely unaware: endophytes, which means “within plants.” Endophytes can also be found in other vegetables, fruits, and grains. In fact, all plants harbor endophytes in the form of bacteria, fungi, and other microbes.
… “It’s really amazing how strongly these endophytes can combat the fungal pathogens of crops,” [microbiologist Sharon] Doty says. And she notes regarding their growth-promoting effects, “It works in maize, in rice, in tomatoes, in bell peppers, and strawberries.” Her team has also isolated endophytes from sweet potatoes that improve the rooting of poplars, a promising biofuels crop.
Endophytes confer additional traits useful for a changing planet. For example, those from geothermal habitats can confer heat tolerance, based on studies led by geneticist Regina Redman. And crop physiologist K. M. Manasa demonstrated salt-tolerance in rice plants inoculated with an endophyte from seaside plants. Rice is salt-sensitive and one of the world’s main food crops. But increasing soil salinity is impacting a fifth of farmable land globally due to climate change and human water and land use practices…
Nitrogen is often the most limiting soil nutrient for crops, something nineteenth-century farmers recognized. Agronomist and Nobel Prize nominee Johanna Döbereiner discovered nitrogen-fixing endophytes in non-legume plants in the twentieth century that, like rhizobia, might reduce the need for financially and environmentally costly synthetic fertilizers. Many of the endophytes Doty has characterized over twenty-five years fix nitrogen and promote growth in lab, greenhouse, and field trials but have a much broader host range than rhizobia, extending from farm lands to forests…
… Developing real-world endophyte applications is a complicated challenge, but a necessary one given the need for more productive and sustainable agriculture. In the meantime, even skeptical farmers are getting on board.
“There’s a lot of conversations going on between researchers and farmers,” says Friesen, to “move the needle on our understanding of these processes that are so important for soil health but also plant health and the stability and security of our food supply.”…
As we muse on microbes, we might send healthy birthday greetings to John Boyd Orr (1st Baron Boyd-Orr); he was born on this date in 1880. A teacher, medical doctor, biologist, nutritional physiologist, politician, businessman, and farmer, he was awarded the Nobel Peace Prize in 1949 for his scientific research into nutrition and for his work as the first Director-General of the United Nations Food and Agriculture Organization.