Archive for February 2024
“We couldn’t build quantum computers unless the universe were quantum and computing… We’re hacking into the universe.”*…
… in the process of which, as Ben Brubaker explains, we learn some fascinating things…
If you want to tile a bathroom floor, square tiles are the simplest option — they fit together without any gaps in a grid pattern that can continue indefinitely. That square grid has a property shared by many other tilings: Shift the whole grid over by a fixed amount, and the resulting pattern is indistinguishable from the original. But to many mathematicians, such “periodic” tilings are boring. If you’ve seen one small patch, you’ve seen it all.
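That shift-invariance is easy to see in a toy model. Here is a minimal sketch (my own illustration, not from Brubaker's article): a checkerboard "tiling" described as a function from grid cells to tile colors, checked for invariance under translation by its period.

```python
# Toy illustration of periodicity: a checkerboard "tiling" described as a
# function from integer grid positions to tile colors. Shifting the whole
# pattern by its period leaves every cell's color unchanged.

def checkerboard(x: int, y: int) -> int:
    """Tile color at grid cell (x, y): 0 or 1."""
    return (x + y) % 2

# Check translation invariance over a sample window for the period (2, 0).
period = (2, 0)
invariant = all(
    checkerboard(x, y) == checkerboard(x + period[0], y + period[1])
    for x in range(-10, 10)
    for y in range(-10, 10)
)
print(invariant)  # True: the shifted pattern is indistinguishable
```

An aperiodic tiling is precisely one for which no such nonzero period exists, no matter how the tiles are arranged.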
In the 1960s, mathematicians began to study “aperiodic” tile sets with far richer behavior. Perhaps the most famous is a pair of diamond-shaped tiles discovered in the 1970s by the polymathic physicist and future Nobel laureate Roger Penrose. Copies of these two tiles can form infinitely many different patterns that go on forever, called Penrose tilings. Yet no matter how you arrange the tiles, you’ll never get a periodic repeating pattern.
“These are tilings that shouldn’t really exist,” said Nikolas Breuckmann, a physicist at the University of Bristol.
For over half a century, aperiodic tilings have fascinated mathematicians, hobbyists and researchers in many other fields. Now, two physicists have discovered a connection between aperiodic tilings and a seemingly unrelated branch of computer science: the study of how future quantum computers can encode information to shield it from errors. In a paper posted to the preprint server arxiv.org in November, the researchers showed how to transform Penrose tilings into an entirely new type of quantum error-correcting code. They also constructed similar codes based on two other kinds of aperiodic tiling.
At the heart of the correspondence is a simple observation: In both aperiodic tilings and quantum error-correcting codes, learning about a small part of a large system reveals nothing about the system as a whole…
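A classical analogue may help build intuition for that "local view reveals nothing" property (this is my own illustration, not the construction in the paper): one-time-pad secret sharing, where a bit is split into two shares, and either share alone is uniformly random.

```python
import random

# Classical analogue of local indistinguishability (not the paper's quantum
# construction): a secret bit is split into two shares; either share alone
# is a uniformly random bit, so examining a "small part" of the encoding
# reveals nothing about the encoded whole.

def split(secret: int) -> tuple[int, int]:
    """Split a secret bit into two shares via a one-time pad."""
    r = random.randint(0, 1)
    return r, r ^ secret  # XOR of the two shares recovers the secret

def reconstruct(a: int, b: int) -> int:
    """Recombine both shares to recover the secret bit."""
    return a ^ b

for secret in (0, 1):
    a, b = split(secret)
    assert reconstruct(a, b) == secret  # both shares together: full recovery
    # Each share individually is 0 or 1 with probability 1/2 regardless of
    # the secret, so no single share carries any information about it.
```

Quantum error-correcting codes take this further: the protected information is spread nonlocally across many qubits, so that no small subset of them betrays it.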
Fascinating: “Never-Repeating Tiles Can Safeguard Quantum Information,” from @benbenbrubaker in @QuantaMagazine.
Plus: bonus background on tiling.
* “We couldn’t build quantum computers unless the universe were quantum and computing. We can build such machines because the universe is storing and processing information in the quantum realm. When we build quantum computers, we’re hijacking that underlying computation in order to make it do things we want: little and/or/not calculations. We’re hacking into the universe.” –Seth Lloyd
###
As we care for qubits, we might send carefully-calculated birthday greetings to Herman Hollerith; he was born on this date in 1860. A statistician and inventor, he was a seminal figure in the development of data processing: he invented (for the 1890 U.S. Census) an electromechanical tabulating machine for punched cards to assist in summarizing information (and, later, for use in accounting). His punched-card tabulating machine, for which he filed his first patent in 1884, marked the beginning of the era of mechanized binary code and semiautomatic data processing systems– and his approach dominated that landscape for nearly a century.
The company that Hollerith founded to exploit his invention was merged in 1911 with several other companies to form the Computing-Tabulating-Recording Company. In 1924, the company was renamed “International Business Machines” (or, as we know it, IBM).
“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…
As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…
In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…
In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2016 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.
A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.
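To get a feel for how severe a 5 percent annual decline is, some back-of-the-envelope compounding (my arithmetic, applied illustratively to the spans discussed above, not a calculation from the paper):

```python
import math

# Illustrative compounding: if aggregate research productivity falls 5% per
# year, how long until it halves, and what fraction would remain over the
# 44-year 1970-2014 span Gordon studies?

rate = 0.05
halving_years = math.log(0.5) / math.log(1 - rate)
print(round(halving_years, 1))  # ~13.5 years to halve

remaining_after_44_years = (1 - rate) ** 44
print(round(remaining_after_44_years, 3))  # ~0.105 -- roughly a tenth remains
```

At that rate, productivity halves roughly every 13.5 years, which is why the headline figure implies that ever more researchers and R&D spending are needed just to hold growth constant.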
This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research‑intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…
…
… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.
This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.
Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…
How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.
See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”
###
As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.
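The elegance of Forrester's scheme was in the addressing: drive half the switching current down one row line and half down one column line, and only the core at their crossing sees enough total current to flip. A toy model of that selection logic (my own sketch of the principle, not Forrester's circuit):

```python
# Toy model of coincident-current selection in core memory: each core flips
# only when the currents on its row and column lines together exceed the
# switching threshold. A half-select current on a single line disturbs no
# core; only the core at the crossing of two energized lines sees full current.

HALF, THRESHOLD = 0.5, 0.9  # half-select current and switching threshold

def write_one(cores, row, col):
    """Drive half-current on one row line and one column line."""
    for r in range(len(cores)):
        for c in range(len(cores[0])):
            current = (HALF if r == row else 0.0) + (HALF if c == col else 0.0)
            if current > THRESHOLD:  # only the doubly selected core switches
                cores[r][c] = 1

cores = [[0] * 4 for _ in range(4)]
write_one(cores, 1, 2)
print(sum(sum(row) for row in cores))  # 1 -- exactly one core switched
```

Because every core sits at a unique row-column crossing, an N×N array needs only 2N drive lines rather than N² individual selects, which is what made large, cheap random-access memories practical.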
“Conceal me what I am, and be my aid for such disguise as haply shall become the form of my intent”*…
Jurisdictional triage…
The website for the shipping registry of Eswatini (formerly Swaziland), established in October 2023, appears much like those of more established seafaring nations. A picture of vast cruise ships sits alongside promises of the “highest quality ship maritime services and ship registrations”. Delve deeper though and Eswatini’s nautical credentials start to unravel. For one thing, the African country is landlocked, calling into doubt the assertion that the port of Mbabane, Eswatini’s capital, is situated on the coast of South Africa. It is a “dry port”, 150km from the sea and 30km from a rail link to Maputo on Mozambique’s Indian Ocean coast. Its stated ability to handle “containers, bulk carriers and tankers” seems questionable.
The country is following in the wake of other smaller nations that offer their flag to shipowners. Seagoing vessels are obliged by maritime law to fly a flag of a country of registration and stateless vessels are not protected by international law. Yet the days when the stern of a ship would fly a national flag connected to the ownership of the vessel are long past. Liberia, Panama and the Marshall Islands now account for nearly half of the global fleet, by tonnage. Countries with loose ties to seafaring have been dubbed “flags of convenience” for levying low or no taxes and offering an escape from burdensome labour laws and other regulatory requirements. Often administered by private companies based elsewhere, these registries are a handy source of additional revenue for small and poor countries.
Registering a merchant vessel with a jurisdiction that is a mere speck on the map is not necessarily a cause for concern. Many take seriously their responsibility to oversee adherence to the rules and regulations of the high seas. Liberia’s, based near Washington, DC, has a good record of maintaining global standards across its fleet. Other registries merely give a “façade of legal oversight”, says Richard Meade, editor of Lloyd’s List Intelligence, a trade publication. A blacklist compiled by the Paris MoU, an organisation that aims to “eliminate the operation of substandard ships”, puts the likes of Cameroon, Vanuatu and Comoros near the bottom…
…
Less diligent registries are helping to fuel the growth of a “dark fleet”—some 1,400 vessels, according to the Atlantic Council, a think-tank—that operates with little regulatory oversight. They are mostly oil tankers that engage in subterfuge to hide where they are and the origin of their cargo in order to evade sanctions on Russian crude oil. Ownership is often opaque. Mr Meade estimates that 12% of the global tanker fleet is now dark. He notes that Gabon’s registry, now comprising 140 vessels, is the fastest-growing in the world thanks largely to the reflagging of Russian tankers.
An expanding dark fleet poses a danger to itself and other vessels. Dark ships tend to be old and less well maintained, and some may be uninsured. Practices such as turning off or “spoofing” location devices are a danger to other ships. Swapping oil cargoes at sea to obscure their origins poses the danger of a spillage. Mr Meade foresees a worse calamity of a large “dark fleet” tanker sinking in an environmentally sensitive area, with no accountability…
Sea-going chicanery: “Why does landlocked Eswatini have a ship registry?” (gift article) from @TheEconomist.
* Shakespeare, Twelfth Night
###
As we deconstruct disguises, we might recall that it was on this date in 1617 that Sweden and the Tsardom of Russia signed the Treaty of Stolbovo, ending the Ingrian War and shutting Russia out of the Baltic Sea… until 1703, when Peter the Great won back access in battle with the Swedes– a victory he cemented by founding St. Petersburg.