“You get what you measure”*…
Matt Stoller takes the occasion of Trump’s selection of Kevin Warsh to head the Fed (“an orthodox Wall Street GOP pick, though he is married to the billionaire heiress of the Estee Lauder fortune and was named in the Epstein files. He’s perceived not as a Trump loyalist but as an avatar of capital”) to ponder why public satisfaction with the economy is so low (“if you judge solely by consumer sentiment, Trump’s first term was the third best economy Americans experienced since 1960. Trump’s second term is not only worse than his first, it is the worst economic management ever recorded by this indicator”).
Stoller argues that we’re measuring the wrong things (or, in some cases, the right things in the wrong ways)…
… the models underpinning how policymakers think about the economy just don’t reflect the realities of modern commerce. The fundamental dynamic is that those models were constructed in an era where America was one discrete economy, with Wall Street and the public tied together by the housing finance system. But today, Americans increasingly live in tiered bubbles that have less and less to do with one another. Warsh will essentially be looking at the wrong indicators, pushing buttons that are mislabeled.
While corporate America is experiencing good times, much of the country is experiencing recessionary conditions. Let’s contrast consumer sentiment indicators with statistics showing an economic boom. Last week, the government came out with stats on real gross domestic product increasing at a scorching 4.4% in the third quarter of last year. There’s higher consumer spending, corporate investment, government spending, and a better trade balance. Inflation, according to the Consumer Price Index, is low at 2.6% over the past year. And while official numbers aren’t out for the final three months of the year, the Atlanta Fed’s GDPNow forecast shows that it estimates growth at 4.2%. And there are other indicators showing prosperity, from low unemployment to high business formation, which was up about 8% last year, as well as record corporate profits…
… Behavioral economists and psychologists have all sorts of reasons to explain that people don’t really understand the economy particularly well. But in general, when the stats and the public mood conflict, I believe the public is usually correct. Often, there are some weird anomalies with the data used by policymakers. In 2023, I noticed that the consumer price index, the typical measure of inflation, didn’t account for borrowing costs, so the Fed hike cycle, which caused increases in the cost of credit cards, mortgages, auto loans, payday loans, and the like, just wasn’t incorporated. The public wasn’t mad at phantom inflation; they were mad at real inflation that the “experts” didn’t see.
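Stoller’s point can be made concrete with a back-of-the-envelope calculation. A minimal sketch, with made-up illustrative numbers (not official BLS data, and a deliberately crude blending method), of how a headline inflation figure changes once household borrowing costs are folded back in:

```python
# Hedged sketch: recombine a CPI-style inflation rate with growth in household
# interest costs (credit cards, mortgages, auto loans), using an assumed small
# expenditure weight for debt service. All numbers are illustrative placeholders.

def adjusted_inflation(cpi_pct, interest_cost_pct, debt_service_weight=0.10):
    """Blend official CPI inflation with year-over-year growth in borrowing costs."""
    return (1 - debt_service_weight) * cpi_pct + debt_service_weight * interest_cost_pct

# Suppose measured CPI inflation is 2.6%, while household interest costs rose
# 30% during a rate-hike cycle: the blended "felt" rate roughly doubles.
print(adjusted_inflation(2.6, 30.0))
```

The specific weight (10%) and the linear blend are assumptions for illustration; the point is only that a gauge excluding borrowing costs and one including them can diverge sharply during a hiking cycle.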
I don’t think that’s the only miscalculation…
[Stoller goes on to explain the ways in which “consumer spending” doesn’t tell us much about consumers anymore, about the painful reality of “spending inequality,” and about the obscure(d) problem of monopoly-driven inflation. He concludes…]
… Finally, there’s a more philosophical point, which I don’t think explains the short-term frustrations people feel, but is directionally correct. Do people actually want what the economy is producing? For most of the 20th century, the answer was yes. When Simon Kuznets invented these measurement statistics in 1934, financial value and the value that Americans placed on products and services were similar. A bigger economy meant things like toilets and electricity spreading across rural America, and cars and food and washing machines.
Today? Well, that’s less clear. According to the Bureau of Labor Statistics, the second fastest growing sector of the economy in terms of GDP growth from 2019-2024 was gambling. Philip Pilkington wrote a good essay last summer on the moral assumptions behind our growth statistics. There is no agreed-upon notion of what makes up an economically valuable object or activity, so our stats are inherently subtle moral judgments. Classic moral philosophers like Adam Smith believed in the “use value” of an item, meaning how it could be used, whereas neoclassical economists believed in the “exchange value” of an item, making no judgments about use and simply counting up its market price.
Normal people subscribe on a moral level to use value. Most of us see someone spending money on a gambling addiction as doing something worse than providing Christmas presents for kids, but not because of price. However, our GDP models use the market value basis. Kuznets, presumably, was not amoral; he just thought that our laws would ban immoral activities like gambling, and so use value and market value wouldn’t diverge. But they have.
It’s not just things like gambling or pornography or speculation. A lot of previously unmeasured activity has been turned into data and monetized, which isn’t actually increasing real growth but measuring what already existed. Take the change from meeting someone at a party to using a dating app. One is part of GDP, the other isn’t. Both are real, but only one would show a bigger economy.
Beyond that, much of our economy is now based on intangibles – the fastest growing sector was software publishing. Is Microsoft moving to a subscription fee model for Office truly some sort of groundbreaking new product? It’s hard to say. While corporate assets used to be hard things like factories, today much of them are intangibles like intellectual property.
A boomcession, where the rich and corporate America experience a boom while working people feel a recession, is a very unhealthy dynamic. It’s certainly possible to create metrics to measure it, and to help policymakers understand real income growth among different subgroups. You could start looking at real income after non-discretionary consumer spending, or find ways of adjusting for price discrimination.
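One of the metrics Stoller proposes – real income after non-discretionary consumer spending, tracked by subgroup – is simple to sketch. A minimal illustration with entirely made-up placeholder figures (not real survey data):

```python
# Hedged sketch of "real income after non-discretionary consumer spending,"
# computed separately for income subgroups. All figures are invented
# placeholders for illustration, not actual household-survey data.

def discretionary_income(income, essentials):
    """Income left after non-discretionary spending (rent, food, healthcare, transport)."""
    return income - essentials

households = {
    "bottom quintile": {"income": 18_000, "essentials": 17_500},
    "middle quintile": {"income": 62_000, "essentials": 48_000},
    "top quintile":    {"income": 210_000, "essentials": 85_000},
}

for group, h in households.items():
    left = discretionary_income(h["income"], h["essentials"])
    print(f"{group}: ${left:,} discretionary ({left / h['income']:.0%} of income)")
```

Even with toy numbers, the shape of the result is the point: aggregate “consumer spending” can boom while the bottom of the distribution has essentially no margin at all – which is exactly the divergence a single-economy model hides.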
But I think a better approach is to try to knit us into one society again. The kinds of policymakers who could try to create metrics to understand the different experiences of classes, and ameliorate them, don’t have power. Instead, the people in charge still use models which presume one economy and one relatively uniform set of prices, where “consumer spending” means stuff consumers want.
I once noted a speech in 2016 by then-Fed Chair Janet Yellen in which she expressed surprise that powerful rich firms and small weak ones had different borrowing rates, which affected the “monetary transmission channel” the Fed relied on. Sure, it was obvious in the real world, but she preferred theory.
Or they don’t use models at all; Kevin Warsh is not an economist, he’s a lawyer and political operative, and is uninterested in academic theory. He cares about corporate profits and capital formation. That probably won’t work out well either.
At any rate, we have to start measuring what matters again. If we don’t, then we’ll continue to be baffled that normal people hate the economy that looks fine on our charts…
The models used by policymakers to understand wages, economic growth, and consumer spending are misleading. That’s why corporate America is having a party, and everyone else is mad. Eminently worth reading in full: “The Boomcession: Why Americans Hate What Looks Like an Economic Boom,” from @matthewstoller.bsky.social (or @mattstoller.skystack.xyz).
* Richard Hamming (and also, apropos the article above, see “Goodhart’s law”)
###
As we ponder the pecuniary, we might recall that it was on this date in 1958 that the Benelux Economic Union was founded, creating the seed from which first the European Economic Community, then the European Union, grew.
On that same day, Philadelphia doo wop group The Silhouettes started five weeks at the top of the Billboard R&B chart with their first single, “Get A Job.”
“I call our world Flatland, not because we call it so, but to make its nature clearer to you, my happy readers, who are privileged to live in Space.”*…
Physicists believe a third class of particles – anyons – could exist, but only in 2D. As Elay Shech asks, what kind of existence is that?…
Everything around you – from tables and trees to distant stars and the great diversity of animal and plant life – is built from a small set of elementary particles. According to established scientific theories, these particles fall into two basic and deeply distinct categories: bosons and fermions.
Bosons are sociable. They happily pile into the same quantum state, that is, the same combination of quantum properties such as energy level, like photons do when they form a laser. Fermions, by contrast, are the introverts of the particle world. They flat out refuse to share a quantum state with one another. This reclusive behaviour is what forces electrons to arrange themselves in layered atomic shells, ultimately giving rise to the structure of the periodic table and the rich chemistry it enables.
At least, that’s what we assumed. In recent years, evidence has been accumulating for a third class of particles called ‘anyons’. Their name, coined by the Nobel laureate Frank Wilczek, gestures playfully at their refusal to fit into the standard binary of bosons and fermions – for anyons, anything goes. If confirmed, anyons wouldn’t just add a new member to the particle zoo. They would constitute an entirely novel category – a new genus – that rewrites the rules for how particles move, interact, and combine. And those strange rules might one day engender new technologies.
Although none of the elementary particles that physicists have detected are anyons, it is possible to engineer environments that give rise to them and potentially harness their power. We now think that some anyons wind around one another, weaving paths that store information in a way that’s unusually hard to disturb. That makes them promising candidates for building quantum computers – machines that could revolutionise fields like drug discovery, materials science, and cryptography. Unlike today’s quantum systems that are easily disturbed, anyon-based designs may offer built-in protection and show real promise as building blocks for tomorrow’s computers.
Philosophically, however, there’s a wrinkle in the story. The theoretical foundations make it clear that anyons are possible only in two dimensions, yet we inhabit a three-dimensional world. That makes them seem, in a sense, like fictions. When scientists seek to explore the behaviours of complicated systems, they use what philosophers call ‘idealisations’, which can reveal underlying patterns by stripping away messy real-world details. But these idealisations may also mislead. If a scientific prediction depends entirely on simplification – if it vanishes the moment we take the idealisation away – that’s a warning sign that something has gone wrong in our analysis.
So, if anyons are possible only through two-dimensional idealisations, what kind of reality do they actually possess? Are they fundamental constituents of nature, emergent patterns, or something in between? Answering these questions means venturing into the quantum world, beyond the familiar classes of particles, climbing among the loops and holes of topology, detouring into the strange physics of two-dimensional flatland – and embracing the idea that apparently idealised fictions can reveal deeper truths…
[Shech explains anyons, and considers the various strategies for making sense of them. (Maybe “paraparticles” like anyons don’t actually exist. Or we simply lack the theoretical framework and the experimental work needed to find them. Or, in ultra-thin-materials physics, we’ve already found them.) Considering the latter two possibilities, he concludes…]
So, if anyons exist, what kind of existence is it? None of the elementary particles are anyons. Instead, physicists appeal to the notion of ‘quasiparticles’, in which large numbers of electrons or atoms interact in complex ways and behave, collectively, like a simpler object you can track with novel behaviours.
Picture fans doing ‘the wave’ in a stadium. The wave travels around the arena as if it’s a single thing, even though it’s really just people standing and sitting in sequence. In a solid, the coordinated motion of many particles can act the same way – forming a ripple or disturbance that moves as if it were its own particle. Sometimes, the disturbance centres on an individual particle, like an electron trying to move through a material. As it bumps into nearby atoms and other electrons, they push back, creating a kind of ‘cloud’ around it. The electron plus its cloud behave like a single, heavier, slower particle with new properties. That whole package is also treated as a quasiparticle.
Some quasiparticles behave like bosons or fermions. But for others, when two of them trade places, the system’s quantum state picks up a built-in marker that isn’t limited to the two familiar settings. It can take on intermediate values, which means novel quantum statistics. If the theories describing these systems are right, then the quasiparticles in question aren’t just behaving oddly, they are anyons: the third type of particles.
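In standard quantum-statistics notation (my gloss, not the essay’s), that “built-in marker” is an exchange phase on the two-particle wavefunction:

```latex
% Exchanging two identical particles multiplies the state by a phase factor:
\psi(x_2, x_1) = e^{i\theta}\,\psi(x_1, x_2),
\qquad
\theta =
\begin{cases}
0 & \text{bosons (symmetric state)}\\
\pi & \text{fermions, since } e^{i\pi} = -1 \text{ (antisymmetric state)}\\
\text{intermediate values} & \text{anyons, possible only in two dimensions}
\end{cases}
```

In three dimensions, swapping twice is continuously deformable to doing nothing, which forces $e^{2i\theta} = 1$ and leaves only the two familiar settings; in two dimensions the winding paths are topologically distinct, so the intermediate phases survive.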
In other words, while none of the elementary particles that physicists have detected are anyons – physicists have never ‘seen’ an anyon in isolation – we can engineer environments that give rise to emergent quasiparticles portraying the quantum statistics of anyons. In this sense, anyons have been experimentally confirmed. But there are different kinds of anyons, and there is still active work being done on the more exotic anyons that we hope to harness for quantum computers.
But even so, are quasiparticles, like anyons, really real? That depends. Some philosophers argue that existence depends on scale. Zoom in close enough, and it makes little sense to talk about tables or trees – those objects show up only at the human scale. In the same way, some particles exist only in certain settings. Anyons don’t appear in the most fundamental theories, but they show up in thin, flat systems where they are the stable patterns that help explain real, measurable effects. From this point of view, they’re as real as anything else we use to explain the world.
Others take a more radical stance. They argue that quasiparticles, fields and even elementary particles aren’t truly real: they’re just useful labels. What really exists is not stuff but structure: relations and patterns. So ‘anyons’ are one way we track the relevant structure when a system is effectively two-dimensional.
Questions about reality take us deep into philosophy, but they also open the door to a broader enquiry: what does the story of anyons reveal about the role of idealisations and fictions in science? Why bother playing in flatland at all?
Often, idealisations are seen as nothing more than shortcuts. They strip away details to make the mathematics manageable, or serve as teaching tools to highlight the essentials, but they aren’t thought to play a substantive role in science. On this view, they’re conveniences, not engines of discovery.
But the story of anyons shows that idealisations can do far more. They open up new possibilities, sharpen our understanding of theory, clarify what a phenomenon is supposed to be in the first place, and sometimes even point the way to new science and engineering.
The first payoff is possibility: idealisation lets us explore a theory’s ‘what ifs’, the range of behaviours it allows even if the world doesn’t exactly realise them. When we move to two dimensions, quantum mechanics suddenly permits a new kind of particle choreography. Not just a simple swap, but wind-and-weave novel rules for how particles can combine and interact. Thinking in this strictly two-dimensional setting is not a parlour trick. It’s a way to see what the theory itself makes possible.
That same detour through flatland also assists us in understanding the theory better. Idealised cases turn up the contrast knobs. In three dimensions, particle exchanges blur into just two familiar options of bosons and fermions. In two dimensions, the picture sharpens. By simplifying the world, the idealisation makes the theory’s structure visible to the naked eye.
Idealisation also helps us pin down what a phenomenon really is. It separates difference-makers from distractions. In the anyon case, the flat setting reveals what would count as a genuine signature, say, a lasting memory of the winding of particles, and what would be a mere lookalike that ordinary bosons or fermions could mimic. It also highlights contrasts with other theoretical possibilities: paraparticles, for example, don’t depend on a two-dimensional world, but anyons seem to. That contrast helps identify what belongs to the essence of anyons and what does not. When we return to real materials, we know what to look for and what to ignore.
Finally, idealisations don’t just help us read a theory – they help write the next one. If experiments keep turning up signatures that seem to exist only in flatland, then what began as an idealisation becomes a compass for discovery. A future theory must build that behaviour into its structure as a genuine, non-idealised possibility. Sometimes, that means showing how real materials effectively enforce the ideal constraint, such as true two-dimensionality. Other times, it means uncovering a new mechanism that reproduces the same exchange behaviour without the fragile assumptions of perfect flatness. In both cases, idealisation serves as a guide for theory-building. It tells us which features must survive, which can bend, and where to look for the next, more general theory.
So, when we venture into flatland to study anyons, we’re not just simplifying – we’re exploring the boundaries where mathematics, matter and reality meet. The journey from fiction to fact may be strange, but it’s also how science moves forward…
Eminently worth reading in full: “Playing in flatland,” from @elayshech.bsky.social in @aeon.co.
Pair with: “Is Particle Physics Dead, Dying, or Just Hard?”
* Edwin A. Abbott, Flatland: A Romance of Many Dimensions
###
As we brood over the boundaries of “being” (and knowing), we might spare a thought for Bertrand Russell; he died on this date in 1970. A philosopher, logician, mathematician, and public intellectual, he influenced mathematics, logic, and several areas of analytic philosophy.
He was one of the early 20th century’s prominent logicians and a founder of analytic philosophy, along with his predecessor Gottlob Frege, his friend and colleague G. E. Moore, and his student and protégé Ludwig Wittgenstein. Russell, with Moore, led the British “revolt against idealism”. Together with his former teacher Alfred North Whitehead, Russell wrote Principia Mathematica, a milestone in the development of classical logic and a major attempt [if ultimately unsuccessful, pace Gödel] to reduce the whole of mathematics to logic. Russell’s article “On Denoting” is considered a “paradigm of philosophy.”