(Roughly) Daily

Archive for February 2024

“We couldn’t build quantum computers unless the universe were quantum and computing… We’re hacking into the universe.”*…

… in the process of which, as Ben Brubaker explains, we learn some fascinating things…

If you want to tile a bathroom floor, square tiles are the simplest option — they fit together without any gaps in a grid pattern that can continue indefinitely. That square grid has a property shared by many other tilings: Shift the whole grid over by a fixed amount, and the resulting pattern is indistinguishable from the original. But to many mathematicians, such “periodic” tilings are boring. If you’ve seen one small patch, you’ve seen it all.

In the 1960s, mathematicians began to study “aperiodic” tile sets with far richer behavior. Perhaps the most famous is a pair of diamond-shaped tiles discovered in the 1970s by the polymathic physicist and future Nobel laureate Roger Penrose. Copies of these two tiles can form infinitely many different patterns that go on forever, called Penrose tilings. Yet no matter how you arrange the tiles, you’ll never get a periodic repeating pattern.

“These are tilings that shouldn’t really exist,” said Nikolas Breuckmann, a physicist at the University of Bristol.

For over half a century, aperiodic tilings have fascinated mathematicians, hobbyists and researchers in many other fields. Now, two physicists have discovered a connection between aperiodic tilings and a seemingly unrelated branch of computer science: the study of how future quantum computers can encode information to shield it from errors. In a paper posted to the preprint server arxiv.org in November, the researchers showed how to transform Penrose tilings into an entirely new type of quantum error-correcting code. They also constructed similar codes based on two other kinds of aperiodic tiling.

At the heart of the correspondence is a simple observation: In both aperiodic tilings and quantum error-correcting codes, learning about a small part of a large system reveals nothing about the system as a whole…

Fascinating: “Never-Repeating Tiles Can Safeguard Quantum Information,” from @benbenbrubaker in @QuantaMagazine.

Plus– bonus background on tiling.

* “We couldn’t build quantum computers unless the universe were quantum and computing. We can build such machines because the universe is storing and processing information in the quantum realm. When we build quantum computers, we’re hijacking that underlying computation in order to make it do things we want: little and/or/not calculations. We’re hacking into the universe.” –Seth Lloyd

###

As we care for qubits, we might send carefully calculated birthday greetings to Herman Hollerith; he was born on this date in 1860. A statistician and inventor, he was a seminal figure in the development of data processing: he invented (for the 1890 U.S. Census) an electromechanical tabulating machine for punched cards to assist in summarizing information (and, later, for use in accounting). His punched card tabulating machine, which he patented in 1884, marked the beginning of the era of mechanized binary code and semiautomatic data processing systems– and his approach dominated that landscape for nearly a century.

The company that Hollerith founded to exploit his invention was merged in 1911 with several other companies to form the Computing-Tabulating-Recording Company. In 1924, the company was renamed “International Business Machines” (or, as we know it, IBM).

source

“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…

As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…

In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…

In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2017 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.

A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.
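To get a feel for the scale of that number, here is a minimal arithmetic sketch of how a 5 percent annual decline compounds; the rate is the Bloom et al. figure quoted above, and the 14-year horizon is just an illustration, not a number from the paper:

```python
# Compound a 5% annual decline in aggregate research productivity.
rate = 0.05
productivity = 1.0  # normalized starting level

for year in range(14):
    productivity *= (1 - rate)

print(round(productivity, 3))  # ≈ 0.488 -- roughly half the starting level
```

In other words, at the measured rate, the research enterprise loses about half its productive capacity in well under a generation.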

This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research-intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…

… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.

This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.

Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…

How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.

See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”

* Robert Gilpin

###

As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.

source

“Conceal me what I am, and be my aid for such disguise as haply shall become the form of my intent”*…

Jurisdictional triage…

The website for the shipping registry of Eswatini (formerly Swaziland), established in October 2023, appears much like those of more established seafaring nations. A picture of vast cruise ships sits alongside promises of the “highest quality ship maritime services and ship registrations”. Delve deeper, though, and Eswatini’s nautical credentials start to unravel. For one thing, the African country is landlocked, calling into doubt the assertion that the port of Mbabane, Eswatini’s capital, is situated on the coast of South Africa. It is a “dry port”, 150km from the sea and 30km from a rail link to Maputo on Mozambique’s Indian Ocean coast. Its stated ability to handle “containers, bulk carriers and tankers” seems questionable.

The country is following in the wake of other smaller nations that offer their flag to shipowners. Seagoing vessels are obliged by maritime law to fly a flag of a country of registration and stateless vessels are not protected by international law. Yet the days when the stern of a ship would fly a national flag connected to the ownership of the vessel are long past. Liberia, Panama and the Marshall Islands now account for nearly half of the global fleet, by tonnage. Countries with loose ties to seafaring have been dubbed “flags of convenience” for levying low or no taxes and offering an escape from burdensome labour laws and other regulatory requirements. Often administered by private companies based elsewhere, these registries are a handy source of additional revenue for small and poor countries.

Registering a merchant vessel with a jurisdiction that is a mere speck on the map is not necessarily a cause for concern. Many take seriously their responsibility to oversee adherence to the rules and regulations of the high seas. Liberia’s, based near Washington, DC, has a good record of maintaining global standards across its fleet. Other registries merely give a “façade of legal oversight”, says Richard Meade, editor of Lloyd’s List Intelligence, a trade publication. A blacklist compiled by the Paris MoU, an organisation that aims to “eliminate the operation of substandard ships”, puts the likes of Cameroon, Vanuatu and Comoros near the bottom…

Less diligent registries are helping to fuel the growth of a “dark fleet”—some 1,400 vessels, according to the Atlantic Council, a think-tank—that operates with little regulatory oversight. They are mostly oil tankers that engage in subterfuge to hide where they are and the origin of their cargo in order to evade sanctions on Russian crude oil. Ownership is often opaque. Mr Meade estimates that 12% of the global tanker fleet is now dark. He notes that Gabon’s registry, now comprising 140 vessels, is the fastest-growing in the world thanks largely to the reflagging of Russian tankers.

An expanding dark fleet poses a danger to itself and other vessels. Dark ships tend to be old and less well maintained, and some may be uninsured. Practices such as turning off or “spoofing” location devices are a danger to other ships. Swapping oil cargoes at sea to obscure their origins poses the danger of a spillage. Mr Meade foresees a worse calamity of a large “dark fleet” tanker sinking in an environmentally sensitive area, with no accountability…

Sea-going chicanery: “Why does landlocked Eswatini have a ship registry?” (gift article) from @TheEconomist.

* Shakespeare, Twelfth Night

###

As we deconstruct disguises, we might recall that it was on this date in 1617 that  Sweden and the Tsardom of Russia signed the Treaty of Stolbovo, ending the Ingrian War and shutting Russia out of the Baltic Sea… until 1703, when Peter the Great won back access in battle with the Swedes– a victory he cemented by founding St. Petersburg.

The Baltic Region after the Treaty of Stolbovo (source)
The Baltic Region today (source)

“Only in our speaking with one another does the world, as that about which we speak, emerge in its objectivity and visibility from all sides”*…

Blake Smith on trail-blazing publisher Michael Denneny and his embodiment of his mentor’s– Hannah Arendt‘s– thought…

Michael Denneny, the recently deceased co-founder and co-editor of the pioneering gay magazine Christopher Street, the gay newspaper New York Native, and the gay publishing line at St. Martin’s Press, Stonewall Inn Editions, began his recently published collection of essays On Christopher Street with a quotation from his mentor, Hannah Arendt:

Only in our speaking with one another does the world, as that about which we speak, emerge in its objectivity and visibility from all sides. Living in a real world and speaking with one another about it are basically one and the same.

Denneny’s career as a gay cultural activist was a way of putting into practice Arendt’s thought as condensed in this citation…

Arendt argued throughout her work, although with critically shifting emphases, that the possibility of political freedom for society as a whole depends on particular groups within it being able to constitute distinct “worlds” in which their members can exchange perspectives, debate their common interests, and face the wider “world” composed of other groups. That is, a healthy society is diverse in the sense of being made up of individual units like economic classes and religious and ethnic minorities (represented by associations, trade unions, churches etc.), which are themselves characterized by internal diversity and lively debate.

Diversity and debate prevent, in a logic familiar from Montesquieu and Madison, the emergence of a single all-powerful leader or stifling consensus. In such accounts, which form the basis for American political common sense today, we imagine minorities as homogenous interest groups, which, in the play of their rival ambitions, keep each other in check, through a kind of balance of power akin to that at work in international relations. Politicized minorities, each pursuing its collective interests, can, if their debates and rivalries are properly channeled, be a force for good in politics.  

Arendt’s argument is substantively different. In her account, minorities are important not insofar as they are internally unified groups engaged in the play of countervailing interests and powers, but rather insofar as they are internally heterogeneous groups whose very diversity offers a sort of school in which citizens learn how to have judgment: the capacity to express and exchange ideas without appeal to fixed rules. Differences within “our own groups”—our everyday experiences of debates with other people “like us” in the spaces of our associational life (synagogues, union halls, gay bars, etc.) prepare us for the still more challenging experiences of disagreement in our wider political life, where we cannot necessarily trust that our interlocutors share our identities, experiences, and goals.  

Indeed, the experience of uncertainty is constitutive of politics, as Arendt saw it. Politics is one of a number of domains, she argued, in which we cannot call upon, in the course of our mutual questioning about what is to be done, anything like a logical principle (2+2=4) that all rational beings might recognize or a universally agreed-upon norm that all, or nearly all, members of our community do recognize. In these domains we are obligated to, as she often says, “woo” each other, to practice the arts of rhetorical seduction—which does not mean in her account, that we are in debates over politics merely practicing sophistry.

Rather, we are—as we find ourselves constantly doing in our most quotidian, non-political conversations—appealing to each other to share perspectives (Look!, we say, don’t you see?), on the assumption that each of us is positioned differently, because of our experiences, knowledge, interests, etc., in relation to a field of objects to which we all refer. We assume, in other words, that our divergent perspectives are perspectives on something, on the same things, and that we can by discussing them, inviting our interlocutors into our position by rendering it in speech, and projecting ourselves through our imaginations into their own positions, come closer to a true picture of the situation…

…there is a danger that we may [Arendt argued], in the very exchange of perspectives, be speaking not at all to each other, that is, to specific interlocutors whose perspectives—and ultimately whose agreement—we desire (and thus whose disagreement we must tolerate), but rather to an abstract universal media pseudo-conversation, to the empty signifier of an invisible authority [and] one must admit that this peril characterizes our idle chatter on Twitter no less than the talk at cocktail parties and banal book reviews Arendt lamented in her day. In that sense it is not necessarily such a disaster if, for the moment, the possibility of a “national conversation” in media and politics seems to be suspended. Indeed, the whole point of Arendt and Denneny’s insight is to remind us that if we are to learn again how to speak to each other (and not merely speak in each other’s—perhaps merely virtual—presence), then participation in the life of real, concrete, internally diverse groups will be our classrooms…   

Hannah Arendt, Michael Denneny, and the real value of diversity: “Living in Arendt’s World.”

* Hannah Arendt

###

As we explore empathy, we might recall that it was on this date in 2015 that Cecilia Bleasdale sent her daughter Grace a photo of a dress she intended to wear to Grace’s wedding. Cecilia thought that the dress, blue with black lace, would be perfect; but her daughter saw a white dress with gold lace. Grace posted the photo to Facebook, and the debate– blue/black or white/gold– broadened.

Then a friend uploaded it to Tumblr… and the argument went global. That post saw up to 840,000 views per minute. The next day, the retailer, Roman Originals (which confirmed that the dress was, in fact, blue and black), sold out of the model within 30 minutes.

It spread further. Celebrities posted and reposted, tweeted and retweeted (e.g., Taylor Swift, who saw blue and black and said she was “confused and scared,” was retweeted 111,134 times and liked 154,188 times); morning news shows covered the controversy…

Science has yet adequately to explain the phenomenon.

source

“The pursuit of science is a grand adventure, driven by curiosity, fueled by passion, and guided by reason”*…

Adam Mastroianni on how science advances (and how it’s held back), with a provocative set of suggestions for how it might be accelerated…

There are two kinds of problems in the world: strong-link problems and weak-link problems.

Weak-link problems are problems where the overall quality depends on how good the worst stuff is. You fix weak-link problems by making the weakest links stronger, or by eliminating them entirely.

Food safety, for example, is a weak-link problem. You don’t want to eat anything that will kill you. That’s why it makes sense for the Food and Drug Administration to inspect processing plants, to set standards, and to ban dangerous foods…

Weak-link problems are everywhere. A car engine is a weak-link problem: it doesn’t matter how great your spark plugs are if your transmission is busted. Nuclear proliferation is a weak-link problem: it would be great if, say, France locked up their nukes even tighter, but the real danger is some rogue nation blowing up the world. Putting on too-tight pants is a weak-link problem: they’re gonna split at the seams.

It’s easy to assume that all problems are like this, but they’re not. Some problems are strong-link problems: overall quality depends on how good the best stuff is, and the bad stuff barely matters. Like music, for instance. You listen to the stuff you like the most and ignore the rest. When your favorite band releases a new album, you go “yippee!” When a band you’ve never heard of and wouldn’t like anyway releases a new album, you go…nothing at all, you don’t even know it’s happened. At worst, bad music makes it a little harder for you to find good music, or it annoys you by being played on the radio in the grocery store while you’re trying to buy your beetle-free asparagus…

Strong-link problems are everywhere; they’re just harder to spot. Winning the Olympics is a strong-link problem: all that matters is how good your country’s best athletes are. Friendships are a strong-link problem: you wouldn’t trade your ride-or-dies for better acquaintances. Venture capital is a strong-link problem: it’s fine to invest in a bunch of startups that go bust as long as one of them goes to a billion…
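The distinction above can be condensed into one line of arithmetic: weak-link quality is the minimum over components, strong-link quality the maximum. A toy sketch (the scores are made-up illustrative numbers, not anything from the essay):

```python
def weak_link_quality(scores):
    """Overall quality is set by the worst component (food safety, car engines)."""
    return min(scores)

def strong_link_quality(scores):
    """Overall quality is set by the best component (music, venture capital)."""
    return max(scores)

portfolio = [0.1, 0.2, 0.3, 0.95]  # mostly duds, one standout

print(weak_link_quality(portfolio))    # 0.1  -- fixing the duds is what matters
print(strong_link_quality(portfolio))  # 0.95 -- only the standout matters
```

Under the max rule, raising the floor barely moves the outcome; raising the ceiling is everything– which is the essay’s point about where scientific effort should go.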

In the long run, the best stuff is basically all that matters, and the bad stuff doesn’t matter at all. The history of science is littered with the skulls of dead theories. No more phlogiston nor phlegm, no more luminiferous ether, no more geocentrism, no more measuring someone’s character by the bumps on their head, no more barnacles magically turning into geese, no more invisible rays shooting out of people’s eyes, no more plum pudding

Our current scientific beliefs are not a random mix of the dumbest and smartest ideas from all of human history, and that’s because the smarter ideas stuck around while the dumber ones kind of went nowhere, on average—the hallmark of a strong-link problem. That doesn’t mean better ideas win immediately. Worse ideas can soak up resources and waste our time, and frauds can mislead us temporarily. It can take longer than a human lifetime to figure out which ideas are better, and sometimes progress only happens when old scientists die. But when a theory does a better job of explaining the world, it tends to stick around.

(Science being a strong-link problem doesn’t mean that science is currently strong. I think we’re still living in the Dark Ages, just less dark than before.)

Here’s the crazy thing: most people treat science like it’s a weak-link problem.

Peer reviewing publications and grant proposals, for example, is a massive weak-link intervention. We spend ~15,000 collective years of effort every year trying to prevent bad research from being published. We force scientists to spend huge chunks of time filling out grant applications—most of which will be unsuccessful—because we want to make sure we aren’t wasting our money…

I think there are two reasons why scientists act like science is a weak-link problem.

The first reason is fear. Competition for academic jobs, grants, and space in prestigious journals is more cutthroat than ever. When a single member of a grant panel, hiring committee, or editorial board can tank your career, you better stick to low-risk ideas. That’s fine when we’re trying to keep beetles out of asparagus, but it’s not fine when we’re trying to discover fundamental truths about the world…

The second reason is status. I’ve talked to a lot of folks since I published The rise and fall of peer review and got a lot of comments, and I’ve realized that when scientists tell me, “We need to prevent bad research from being published!” they often mean, “We need to prevent people from gaining academic status that they don’t deserve!” That is, to them, the problem with bad research isn’t really that it distorts the scientific record. The problem with bad research is that it’s cheating

I get that. It’s maddening to watch someone get ahead using shady tactics, and it might seem like the solution is to tighten the rules so we catch more of the cheaters. But that’s weak-link thinking. The real solution is to care less about the hierarchy

Here’s our reward for a generation of weak-link thinking.

The US government spends ~10x more on science today than it did in 1956, adjusted for inflation. We’ve got loads more scientists, and they publish way more papers. And yet science is less disruptive than ever, scientific productivity has been falling for decades, and scientists rate the discoveries of decades ago as worthier than the discoveries of today. (Reminder, if you want to blame this on ideas getting harder to find, I will fight you.)…

Whether we realize it or not, we’re always making calls like this. Whenever we demand certificates, credentials, inspections, professionalism, standards, and regulations, we are saying: “this is a weak-link problem; we must prevent the bad!”

Whenever we demand laissez-faire, the cutting of red tape, the letting of a thousand flowers bloom, we are saying: “this is a strong-link problem; we must promote the good!”

When we get this right, we fill the world with good things and rid the world of bad things. When we don’t, we end up stunting science for a generation. Or we end up eating a lot of asparagus beetles…

“Science is a strong-link problem,” from @a_m_mastroianni in @science_seeds.

* James Clerk Maxwell

###

As we ponder the process of progress, we might spare a thought for Sir Christopher Wren; he died on this date in 1723.  A mathematician and astronomer (who co-founded and later served as president of the Royal Society), he is better remembered as one of the most highly acclaimed English architects in history; he was given responsibility for rebuilding 52 churches in the City of London after the Great Fire in 1666, including what is regarded as his masterpiece, St. Paul’s Cathedral, on Ludgate Hill.

Wren, whose scientific work ranged broadly– e.g., he invented a “weather clock” similar to a modern barometer, new engraving methods, and helped develop a blood transfusion technique– was admired by Isaac Newton, as Newton noted in the Principia.

 source