(Roughly) Daily

Posts Tagged ‘semiconductors’

“In order for the United States to do the right things for the long term, it appears to be helpful for us to have the prospect of humiliation. Sputnik helped us fund good science – really good science: the semiconductor came out of it.”*…

Now the question is the semiconductor itself… and as Arthur Goldhammer explains in his review of Chris Miller‘s important new book Chip War, the answer may not be as clear as many suggest…

In left-liberal circles there is a rough consensus about what has gone wrong with our politics over the past 40 years. The critique can be summed up in two words: neoliberalism and globalization. Although these capacious ideological generalizations cover a multitude of sins, the gravamen of the charge against both is that, in the name of economic efficiency and growth, globalizing neoliberals of both the right and the left justified depriving national governments of the power to reduce inequalities of wealth and income, promote equal opportunity, and protect the health and welfare of the citizenry. Neoliberals prioritized property rights over social and political rights and protected markets from political meddling. They removed regulatory fetters on the movement of capital and sought the cheapest labor they could find to put their money to work. As a result, from the late 1970s on, governments across the developed world retreated from the social democratic reforms credited with fostering the harmonious prosperity of the three decades following World War II—the period the French have dubbed les Trente Glorieuses—thereby triggering a populist and xenophobic backlash while polarizing previously consensual political systems and weakening resistance to authoritarian demagogues.

This account of political change across the Western world since the 1980s has much to recommend it, not least the implication that the globalized neoliberal regime has sown the seeds of its own impending demise. This is the view espoused in one form or another by a number of excellent recent books, among them Gary Gerstle’s The Rise and Fall of the Neoliberal Order, Michael Tomasky’s The Middle Out, and Bradford DeLong’s Slouching Towards Utopia. Yet each of these estimable authors embraces the notion that the novel feature of the period was superstructural, to borrow a term of art from the Marxist lexicon: All believe that ideology was in the driver’s seat and that it was the readiness of left-liberals to accede to the tenets of market-first ideology that established neoliberalism as the unsurpassable political horizon of the age (to borrow a phrase from philosopher Jean-Paul Sartre).

But what if this superstructural interpretation is incomplete? What if it blinds us to a deeper transformation of the means of production themselves? What if the key innovation of the 1970s and ’80s was the advent not of neoliberal ideology but of the microprocessor, which simultaneously created new markets, dramatically altered trade flows, and shifted both the economic and military balance of power among nations? And what if this crucial technological innovation can trace its roots all the way back to the aforementioned Trente Glorieuses? What if the glory years of social democracy saw the benefits of higher education spread much more widely than ever before, disseminating technological skills throughout the world and making it possible to tap far more of humanity’s collective brainpower, while creating a web of interdependent corporations spanning both the developed and less developed worlds? The microprocessor not only became the flagship product of the neoliberal era’s dominant industry but also served as its indispensable instrument, without which it would have been impossible to tame the torrents of information necessary to manage far-flung supply chains and global capital flows.

Chris Miller’s Chip War deserves credit precisely for redirecting our attention from superstructure to base, from the high political drama of the past four decades to the more prosaic business of manufacturing microchips. At its most basic level, the book offers a masterful history of the semiconductor industry, from the invention of the first transistor in 1947 to the incredibly complex machinery required to deposit tens of billions of nearly atom-sized switches on a silicon chip no larger than a fingernail. Miller, who teaches international history at Tufts University’s Fletcher School, emphasizes the national security implications of a global supply chain in which components crucial to U.S. defense must pass through choke points, such as Taiwan, that are subject to intervention by commercial and strategic rivals. But the history he recounts in vivid detail also tells a more hopeful story, illustrating the way in which globalization has made it possible to mobilize humanity’s collective brainpower to achieve progress that no single country could have achieved on its own.

In assessing the national security risks posed by China’s semiconductor ambitions, some analysts seem to have accepted Andy Grove’s adage that “only the paranoid survive” at face value. While one former UK intelligence official argued that “we should accept that China will be a global tech power in the future and start managing the risk,” the United States, taking a darker view of China’s aims, has set out to stop China in its tracks by pressuring allies to reject Huawei equipment, by banning the export of certain U.S.-developed chipmaking technologies to China, and by subsidizing domestic manufacturing through the CHIPS Act of 2022 and related legislation.

Such aggressive policies could backfire, however. Miller quotes China tech policy analyst Dan Wang, who argues that American restrictions have “boosted Beijing’s quest for tech dominance” by catalyzing new Chinese government policies that support its domestic chip industry, including the training of tens of thousands of electrical engineers and condensed matter physicists. There are good reasons to worry about China’s military ambitions, but it is probably futile to try to halt the spread of technology as though it were a bulk good susceptible to blockade. There are also less aggressive ways to alleviate Chinese threats to the global supply chain: For instance, U.S. incentives have encouraged TSMC to move some of its operations from Taiwan to Arizona.

Finally, history shows that trying to stymie competitors by impeding the flow of technical information is unlikely to work against an adversary like China, with a large pool of educated workers and substantial ability to invest in research and development. Remember that Britain tried to monopolize early nineteenth-century textile technology, but Samuel Slater, the “father of the American Industrial Revolution,” used his knowledge of British machine designs to develop better technology in his adopted country. The way to compete effectively with China is not to ratchet up bellicose rhetoric about defending Taiwan or attempt to halt the spread of technical know-how by drafting new CHIPS Acts, but to educate American workers and foster closer cooperation with other countries that have taken the lead in developing key aspects of the semiconductor manufacturing process. The history that Miller recounts demonstrates that what matters most in achieving technological leadership is free movement of people and ideas, not tariffs, export controls, or paranoid levels of fear. The best counterweight to Chinese military and commercial ambitions is the collective brainpower of the democratic world, not chip embargoes and saber-rattling…

The United States wants to stop China’s semiconductor industry in its tracks. Here’s how that could backfire: “Chip Shots,” from @artgoldhammer in @DemJournal. Eminently worth reading in full.

See also: “No, I Do Not Think the Microprocessor Doomed Social Democracy,” an elaboration on and response to Goldhammer from Brad DeLong (@delong).

* Bill Gates


As we ponder policy, we might recall that it was on this date in 1980 that Microsoft launched its first hardware product, the Z-80 SoftCard.

The brainchild of Paul Allen, the SoftCard was a plug-in card containing a Zilog Z80 microprocessor; inserted into an Apple II personal computer, it allowed the machine to run programs written for the CP/M operating system. CP/M was a very popular OS for early personal computers, one for which much software was written. Indeed, the word processor WordStar was so popular that users purchased the SoftCard and a companion “80-column card” just to run it on the Apple II. At one point, the SoftCard product brought in about half of Microsoft’s total revenue. It was discontinued in 1986 as CP/M’s popularity waned in the face of competition from Microsoft’s own MS-DOS (and the growing popularity of Microsoft’s Word and Excel applications).


“Visualization gives you answers to questions you didn’t know you had”*…

Reckoning before writing: Mesopotamian Clay Tokens

Physical representations of data have existed for thousands of years. The List of Physical Visualizations (and the accompanying Gallery) collect illustrative examples, e.g…

5500 BC – Mesopotamian Clay Tokens

The earliest data visualizations were likely physical: built by arranging stones or pebbles, and later, clay tokens. According to the archaeologist Denise Schmandt-Besserat (1999):

“Whereas words consist of immaterial sounds, the tokens were concrete, solid, tangible artifacts, which could be handled, arranged and rearranged at will. For instance, the tokens could be ordered in special columns according to types of merchandise, entries and expenditures; donors or recipients. The token system thus encouraged manipulating data by abstracting all possible variables. (Harth 1983. 19) […] No doubt patterning, the presentation of data in a particular configuration, was developed to highlight special items (Luria 1976. 20).”

Clay tokens suggest that physical objects were used to externalize information, support visual thinking and enhance cognition long before paper and writing were invented…

There are 370 entries (so far). Browse them at List of Physical Visualizations (@dataphys)

* Ben Shneiderman


As we celebrate the concrete, we might send carefully-calculated birthday greetings to Rolf Landauer; he was born on this date in 1927. A physicist, he made a number of important contributions in a range of areas: the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.

He is probably best remembered for “Landauer’s Principle,” which describes a minimum energy cost of computation. Whenever the machine resets for another computation, bits are flushed from the computer’s memory, and that electronic operation necessarily dissipates a certain amount of energy (a logical consequence of the second law of thermodynamics). Thus, when information is erased, there is an inevitable “thermodynamic cost of forgetting,” which constrains the development of more energy-efficient computers. The maximum entropy of a bounded physical system is finite, so while most engineers dealt with the practical limitations of packing ever more circuitry onto tiny chips, Landauer considered the theoretical limit: if technology improved indefinitely, how soon would it run into the insuperable barriers set by nature?

A so-called logically reversible computation, in which no information is erased, may in principle be carried out without releasing any heat. This has led to considerable interest in the study of reversible computing. Indeed, without reversible computing, increases in the number of computations per joule of energy dissipated must eventually come to a halt. If Koomey’s law continues to hold, the limit implied by Landauer’s principle would be reached around the year 2050.
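The scale of that barrier is easy to make concrete. Here is a minimal sketch of the arithmetic behind Landauer’s bound, k·T·ln 2 joules per erased bit (the 300 K figure is a conventional room-temperature assumption, not something from the text above):

```python
import math

# Landauer's principle: irreversibly erasing one bit dissipates
# at least k_B * T * ln(2) joules of energy as heat.
BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI revision)

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit at the given temperature."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

room_temp = 300.0  # K -- a conventional "room temperature" assumption
e_min = landauer_limit(room_temp)
print(f"Landauer limit at {room_temp:.0f} K: {e_min:.2e} J per bit erased")

# Its reciprocal caps how many irreversible bit operations a single joule
# can ever pay for at that temperature -- the ceiling toward which
# Koomey's-law efficiency gains are headed.
print(f"Ceiling: {1.0 / e_min:.2e} irreversible bit-ops per joule")
```

At 300 K the bound works out to roughly 3×10⁻²¹ joules per bit, many orders of magnitude below what present-day chips dissipate per operation, which is why the wall lies decades away rather than at hand.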

