Posts Tagged ‘semiconductors’
“Give credit where credit is due”*…
In the early 19th century, a young woman revolutionized the lumber business…
As a young woman, Tabitha Babbitt was a weaver in Harvard, Massachusetts, where she used to watch the workers at the local sawmill. Observing them use the difficult two-man whipsaw, she noticed that half of their motion was wasted: the saw cut the wood only on the forward stroke, so the return stroke did nothing but drag the blade back to its starting position. Tabitha proposed a round blade to eliminate this waste. Eventually she produced a prototype, attaching a circular blade to her spinning wheel and using the wheel’s pedal to power it. As the blade spun, no movement was wasted. Connected to a water-powered machine, the circular saw reduced the effort of cutting lumber, meaning that wood could be cut faster with half the manpower. The first circular saw she allegedly made is preserved in Albany, New York; a larger version of her design was later installed in the sawmill.
But – Tabitha was a member of the Shakers, a Christian sect founded circa 1747 in England, whose adherents had emigrated and settled in revolutionary colonial America. Their core beliefs centred on a perfect society, created through communal living, gender and racial equality, pacifism, confession of sin, celibacy, and separation from the world. As such, they valued hard manual work and a simple lifestyle, and thrived in the forestry industry.
However, their beliefs prohibited any member from applying for patents, as they held that intellectual property should be shared by the community without restriction. Because she did not patent it (and, according to Wikipedia, the reference to her invention exists only in Shaker lore), there is controversy over whether she was the first true inventor of the circular saw.
Two Frenchmen patented the circular saw in the USA after reading about her saw in Shaker papers. One of the patentees, Stephen Miller, argued that she wasn’t the first inventor, based on the date she joined the sect. He contended that the saw was invented at Mount Lebanon Shaker Village by Amos Bishop or Benjamin Bruce in 1793 – or perhaps not by a Shaker at all.
Samuel Miller obtained a UK patent in 1777 for a sawing windmill that supposedly used a form of circular saw, though the type of saw is mentioned only in passing, suggesting that it was not his invention. A few years later Walter Taylor, in the same area of the United Kingdom, seems to have had circular saws of some kind at his sawmill, but he only ever received patents for improvements to blockmaking.
However, it appears Babbitt’s circular saw design was much larger than other circular-saw mechanisms, and enough modifications were made to differentiate her invention from the rest. Hers was also the basic design soon copied at various American sawmills, popularising the use of the circular saw in mills.
Tabitha was also credited with improving the spinning wheel head, inventing a process to manufacture false teeth, and devising a process for manufacturing the then semi-revolutionary type of nail known as “cut nails,” which replaced forged nails – a claim to fame she shares with a few other inventors, including Eli Whitney…
An unsung hero: “Tabitha Babbitt” from the Mills Archive, via Mathew Ingram‘s When the Going Gets Weird (also source of the image above).
* Attributed to Samuel Adams, who used the phrase in a late 18th century letter
###
As we investigate innovation, we might recall that it was on this date in 1959 that Jack Kilby of Texas Instruments filed the first patent for an integrated circuit (U.S. Patent 3,138,743). In mid-1958, as a newly employed engineer at Texas Instruments, Kilby didn’t yet have the right to a summer vacation. So he spent the summer working on the problem in circuit design known as the “tyranny of numbers” (how to add more and more components, all soldered to all of the others, to improve performance). He finally came to the conclusion that manufacturing the circuit components en masse in a single piece of semiconductor material could provide a solution. On September 12, he presented his findings to the management: a piece of germanium with an oscilloscope attached. Kilby pressed a switch, and the oscilloscope showed a continuous sine wave– proving that his integrated circuit worked and thus that he had solved the problem.
Kilby is generally credited as co-inventor of the integrated circuit, along with Robert Noyce (who independently made a similar circuit a few months later). Kilby has been honored in many ways for his breakthrough, probably most augustly with the 2000 Nobel Prize in Physics.

“It’s peculiar. It’s special. There’s very little of it, but it has this pivotal role in the universe.”*…
One of the oldest, scarcest elements in the universe has given us treatments for mental illness, ovenproof casserole dishes, and electric cars. Increasingly, our response to climate change seems to depend on it. But how much do we really know about lithium? Jacob Baynham explains…
The universe was born small, unimaginably dense and furiously hot. At first, it was all energy contained in a volume of space that exploded in size by a factor of 100 septillion in a fraction of a second. Imagine it as a single cell ballooning to the size of the Milky Way almost instantaneously. Elementary particles like quarks, photons and electrons were smashing into each other with such violence that no other matter could exist. The primordial cosmos was a white-hot smoothie in a blender.
One second after the Big Bang, the expanding universe was 10 billion degrees Kelvin. Quarks and gluons had congealed to make the first protons and neutrons, which collided over the course of a few minutes and stuck in different configurations, forming the nuclei of the first three elements: two gases and one light metal. For the next 100 million years or so, these would be the only elements in the vast, unblemished fabric of space before the first stars ignited like furnaces in the dark to forge all other matter.
Almost 14 billion years later, on the third rocky planet orbiting a young star in a distal arm of a spiral galaxy, intelligent lifeforms would give names to those first three elements. The two gases: hydrogen and helium. The metal: lithium.
This is the story of that metal, a powerful, promising and somehow still mysterious element on which those intelligent lifeforms — still alone in the universe, as far as they know — have pinned their hopes for survival on a planet warmed by their excesses…
[Baynham tells the story of this remarkable element, the development of its many uses (in psychopharmacology, in materials science, and of course in electronics– especially batteries), the rigors of extracting it for those purposes, and the challenges that its scarcity– and its potency– present…]
… Long before cell phones and climate anxiety and the Tesla Model Y, long before dinosaurs and the first creatures that climbed out of the ocean to walk on land, long before the Earth formed from swirling masses of cosmic matter heavy enough to coalesce, back, way back, to the infant universe, to the dawn of matter itself, there were just three types of atoms — three elements in the blank canvas of space. One of them was lithium. It was light, fragile and extremely reactive, its one outer electron tenuously held in place.
Everything we have done with lithium, all its wondrous applications in energy, industry and psychiatry, somehow hinges on this basic structure, a sort of magic around which we’re increasingly engineering our future. Lightness is usually associated with abundance on the periodic table — almost 99% of the mass of the universe is just the lightest two elements. Lithium, however, is the third lightest element and still mysteriously scarce…
That most elemental of elements: “The Secret, Magical Life of Lithium,” from @JacobBaynham in @noemamag.com.
###
As we muse on materials, we might send densely-packed birthday greetings to Philip W. Anderson; he was born on this date in 1923. A theoretical physicist, he shared (with John H. Van Vleck and Sir Nevill F. Mott) the 1977 Nobel Prize for Physics for his research on semiconductors, superconductivity, and magnetism. Anderson made contributions to the theories of localization, antiferromagnetism, symmetry breaking (including a 1962 paper discussing symmetry breaking in particle physics, which contributed to the development of the Standard Model around 10 years later), and high-temperature superconductivity, and to the philosophy of science through his writings on emergent phenomena. He was a pioneer in the field that he named: condensed matter physics, which has found applications in semiconductor and laser technology, magnetic storage, liquid crystals, optical fibers, nanotechnology, quantum computing, and biomedicine.
“We have the best government that money can buy”*…

Most Americans agree that the prevalence of big money in politics is a problem. But sometimes it can be hard to see the (tallest) trees for the forest. The estimable Molly White…
Did you know that the cryptocurrency industry has spent more on 2024 elections in the United States than the oil industry? More than the pharmaceutical industry?
In fact, the cryptocurrency industry has spent more on 2024 elections than [either] the entire energy sector [or] the entire health sector. Those industries, both worth hundreds of billions or trillions of dollars, are being outspent by an industry that, even by generous estimates, is worth less than $20 billion.**
Most of the cryptocurrency industry’s money is going to massive super PACs like Fairshake — that is, single-issue committees focused only on installing crypto-friendly politicians and ousting those the industry views as a threat.
Although these PACs have spent only a fraction of the more than $200 million in their combined war chests, they’re already finding some success. Democratic California Senate candidate Katie Porter lost her primary race after Fairshake spent $10 million on attack ads against her. In New York, Fairshake piled on $2 million to oppose Democratic House candidate Jamaal Bowman, who ultimately lost his primary race. $1.5 million from the Republican-focused blockchain super PAC, Defend American Jobs, helped Republican Jim Justice win his West Virginia Senate primary; $1.7 million from the Democrat-focused blockchain super PAC, Protect Progress, similarly aided Shomari Figures in winning his Alabama House primary.
Although election spending information is public, it can be incredibly time- and labor-intensive to comb through. The crypto industry isn’t helping to make things clearer, either, with innocuously-named PACs like “Fairshake” that obscure the goals of these committees. Although the industry likes to claim that crypto is a major election issue with grassroots support, advertisements run by these committees rarely mention cryptocurrency or blockchains at all, or even technology or finance more broadly. And some of these PACs funnel money through surrogate committees, obscuring the origins of some of the more heavily partisan spending.
Furthermore, the wealthy executives and venture capitalists associated with the industry are spending heavily as individuals, without going through these PACs or spending through their companies. Cameron and Tyler Winklevoss — founders of the Gemini cryptocurrency exchange — each donated $1 million to Donald Trump’s presidential campaign. They were followed soon after by Jesse Powell, chairman of the Kraken cryptocurrency exchange, who pitched in another million…
** Unlike most other industries, people really like to estimate the “size” of the cryptocurrency industry by the total market cap of all cryptocurrencies (a notoriously inaccurate number). Estimates based on traditional metrics vary widely from low single-digit billions to around $20 billion, although the higher numbers are typically projections rather than historical data.
The biggest corporate money in politics– how the cryptocurrency industry is spending to influence 2024 elections in the United States: “Follow the Crypto,” from @molly0xFFF.
You can, in fact, “follow the crypto” on White’s new site, which tracks contributions to cryptocurrency-focused super PACs like Fairshake, Defend American Jobs, and Protect Progress. As White observes:
Despite how much these PACs have already raised, the cryptocurrency industry is only ramping up their spending as elections draw nearer, and most of the money held by these PACs is still waiting to be deployed. 50% of the funds in Fairshake’s coffers — $85 million — was raised in May alone, and that’s the last month with complete data. These companies show no sign of slowing down. With this site, we will be able to follow how these industries are working to buy influence across the country…
* Mark Twain
###
As we ponder the purchase of the plebiscite, we might recall that it was on this date in 1968 that a major contributor to the technostructure supporting crypto was born when Robert Noyce and Gordon Moore incorporated Intel with $2.5 million in capital. The semiconductor company became the emerging industry’s leader, before relinquishing that title (now held by TSMC); still, the company is currently valued at $131.32 billion.

“In order for the United States to do the right things for the long term, it appears to be helpful for us to have the prospect of humiliation. Sputnik helped us fund good science – really good science: the semiconductor came out of it.”*…
Now the question is the semiconductor itself… and as Arthur Goldhammer explains in his review of Chris Miller‘s important new book Chip War, the answer may not be as clear as many suggest…
In left-liberal circles there is a rough consensus about what has gone wrong with our politics over the past 40 years. The critique can be summed up in two words: neoliberalism and globalization. Although these capacious ideological generalizations cover a multitude of sins, the gravamen of the charge against both is that, in the name of economic efficiency and growth, globalizing neoliberals of both the right and the left justified depriving national governments of the power to reduce inequalities of wealth and income, promote equal opportunity, and protect the health and welfare of the citizenry. Neoliberals prioritized property rights over social and political rights and protected markets from political meddling. They removed regulatory fetters on the movement of capital and sought the cheapest labor they could find to put their money to work. As a result, from the late 1970s on, governments across the developed world retreated from the social democratic reforms credited with fostering the harmonious prosperity of the three decades following World War II—the period the French have dubbed les Trente Glorieuses—thereby triggering a populist and xenophobic backlash while polarizing previously consensual political systems and weakening resistance to authoritarian demagogues.
This account of political change across the Western world since the 1980s has much to recommend it, not least the implication that the globalized neoliberal regime has sown the seeds of its own impending demise. This is the view espoused in one form or another by a number of excellent recent books, among them Gary Gerstle’s The Rise and Fall of the Neoliberal Order, Michael Tomasky’s The Middle Out, and Bradford DeLong’s Slouching Towards Utopia. Yet each of these estimable authors embraces the notion that the novel feature of the period was superstructural, to borrow a term of art from the Marxist lexicon: All believe that ideology was in the driver’s seat and that it was the readiness of left-liberals to accede to the tenets of market-first ideology that established neoliberalism as the unsurpassable political horizon of the age (to borrow a phrase from philosopher Jean-Paul Sartre).
But what if this superstructural interpretation is incomplete? What if it blinds us to a deeper transformation of the means of production themselves? What if the key innovation of the 1970s and ’80s was the advent not of neoliberal ideology but of the microprocessor, which simultaneously created new markets, dramatically altered trade flows, and shifted both the economic and military balance of power among nations? And what if this crucial technological innovation can trace its roots all the way back to the aforementioned Trente Glorieuses? What if the glory years of social democracy saw the benefits of higher education spread much more widely than ever before, disseminating technological skills throughout the world and making it possible to tap far more of humanity’s collective brainpower, while creating a web of interdependent corporations spanning both the developed and less developed worlds? The microprocessor not only became the flagship product of the neoliberal era’s dominant industry but also served as its indispensable instrument, without which it would have been impossible to tame the torrents of information necessary to manage far-flung supply chains and global capital flows.
Chris Miller’s Chip War deserves credit precisely for redirecting our attention from superstructure to base, from the high political drama of the past four decades to the more prosaic business of manufacturing microchips. At its most basic level, the book offers a masterful history of the semiconductor industry, from the invention of the first transistor in 1947 to the incredibly complex machinery required to deposit tens of billions of nearly atom-sized switches on a silicon chip no larger than a fingernail. Miller, who teaches international history at Tufts University’s Fletcher School, emphasizes the national security implications of a global supply chain in which components crucial to U.S. defense must pass through choke points such as Taiwan subject to intervention by commercial and strategic rivals. But the history he recounts in vivid detail also tells a more hopeful story, illustrating the way in which globalization has made it possible to mobilize humanity’s collective brainpower to achieve progress that no single country could have achieved on its own.
…
In assessing the national security risks posed by China’s semiconductor ambitions, some analysts seem to have accepted Andy Grove’s adage that “only the paranoid survive” at face value. While one former UK intelligence official argued that “we should accept that China will be a global tech power in the future and start managing the risk,” the United States, taking a darker view of China’s aims, has set out to stop China in its tracks by pressuring allies to reject Huawei chips, by banning the export of certain U.S.-developed technologies to China, and by subsidizing domestic production, most notably with the CHIPS Act of 2022 and related legislation.
Such aggressive policies could backfire, however. Miller quotes China tech policy analyst Dan Wang, who argues that American restrictions have “boosted Beijing’s quest for tech dominance” by catalyzing new Chinese government policies that support their local chip industry, including the training of tens of thousands of electrical engineers and condensed matter physicists. There are good reasons to worry about China’s military ambitions, but it is probably futile to try to halt the spread of technology as though it were a bulk good susceptible to blockade. There are also less aggressive ways to alleviate Chinese threats to the global supply chain: For instance, U.S. incentives have encouraged TSMC to move some of its operations from Taiwan to Arizona.
Finally, history shows that trying to stymie competitors by impeding the flow of technical information is unlikely to work against an adversary like China, with a large pool of educated workers and substantial ability to invest in research and development. Remember that Britain tried to monopolize early nineteenth-century textile technology, but Samuel Slater, the “father of the American Industrial Revolution,” used his knowledge of British machine designs to develop better technology in his adopted country. The way to compete effectively with China is not to ratchet up bellicose rhetoric about defending Taiwan or attempt to halt the spread of technical know-how by drafting new CHIPS Acts, but to educate American workers and foster closer cooperation with other countries that have taken the lead in developing key aspects of the semiconductor manufacturing process. The history that Miller recounts demonstrates that what matters most in achieving technological leadership is free movement of people and ideas, not tariffs, export controls, or paranoid levels of fear. The best counterweight to Chinese military and commercial ambitions is the collective brainpower of the democratic world, not chip embargoes and saber-rattling…
The United States wants to stop China’s semiconductor industry in its tracks. Here’s how that could backfire: “Chip Shots,” from @artgoldhammer in @DemJournal. Eminently worth reading in full.
See also: “No, I Do Not Think the Microprocessor Doomed Social Democracy,” an elaboration on and response to Goldhammer from Brad DeLong (@delong).
* Bill Gates
###
As we ponder policy, we might recall that it was on this date in 1980 that Microsoft launched its first hardware product, the Z-80 SoftCard.
The brainchild of Paul Allen, the SoftCard was a microprocessor card that plugged into the Apple II personal computer, allowing it to run programs written for the CP/M operating system. CP/M was a very popular OS for early personal computers, one for which much software was written. Indeed, the word processor WordStar was so popular that users purchased the SoftCard and a companion “80-column card” just to run it on the Apple II. At one point, the SoftCard product brought in about half of Microsoft’s total revenue. It was discontinued in 1986 as CP/M’s popularity waned in the face of competition from Microsoft’s own MS-DOS (and the growing popularity of Microsoft’s Word and Excel applications).
“Visualization gives you answers to questions you didn’t know you had”*…
Physical representations of data have existed for thousands of years. The List of Physical Visualizations (and the accompanying Gallery) collect illustrative examples, e.g…
5500 BC – Mesopotamian Clay Tokens
The earliest data visualizations were likely physical: built by arranging stones or pebbles, and later, clay tokens. According to an eminent archaeologist (Schmandt-Besserat, 1999):
“Whereas words consist of immaterial sounds, the tokens were concrete, solid, tangible artifacts, which could be handled, arranged and rearranged at will. For instance, the tokens could be ordered in special columns according to types of merchandise, entries and expenditures; donors or recipients. The token system thus encouraged manipulating data by abstracting all possible variables. (Harth 1983. 19) […] No doubt patterning, the presentation of data in a particular configuration, was developed to highlight special items (Luria 1976. 20).”
Clay tokens suggest that physical objects were used to externalize information, support visual thinking and enhance cognition way before paper and writing were invented…
There are 370 entries (so far). Browse them at the List of Physical Visualizations (@dataphys).
###
As we celebrate the concrete, we might send carefully-calculated birthday greetings to Rolf Landauer; he was born on this date in 1927. A physicist, he made a number of important contributions across a range of areas: the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.
He is probably best remembered for “Landauer’s Principle,” which sets a lower bound on the energy used during a computer’s operation. Whenever the machine resets for another computation, bits are flushed from the computer’s memory, and in that electronic operation a certain minimum amount of energy – at least kT ln 2 per bit erased – is dissipated as heat (a simple logical consequence of the second law of thermodynamics). Thus, when information is erased, there is an inevitable “thermodynamic cost of forgetting,” which governs the development of more energy-efficient computers. The maximum entropy of a bounded physical system is finite– so while most engineers dealt with the practical limitations of compacting ever more circuitry onto tiny chips, Landauer considered the theoretical limit: if technology improved indefinitely, how soon would it run into the insuperable barriers set by nature?
A so-called logically reversible computation, in which no information is erased, may in principle be carried out without releasing any heat. This has led to considerable interest in the study of reversible computing. Indeed, without reversible computing, increases in the number of computations per joule of energy dissipated must eventually come to a halt. If Koomey‘s law continues to hold, the limit implied by Landauer’s principle would be reached around the year 2050.
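The arithmetic behind that ~2050 estimate can be sketched in a few lines. The snippet below computes the Landauer limit at room temperature and extrapolates Koomey’s law forward; the baseline efficiency, base year, and doubling period are illustrative assumptions (Koomey’s historical figure was a doubling roughly every 1.6 years), not measured data, so the exact crossover year it prints should be read as order-of-magnitude only:

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2.
k_B = 1.380649e-23           # Boltzmann constant, J/K (exact SI value)
T = 300.0                    # room temperature in kelvin (assumption)

e_min = k_B * T * math.log(2)         # minimum joules per bit erased (~2.87e-21 J)
landauer_ops_per_joule = 1.0 / e_min  # ~3.5e20 bit-erasures per joule

# Koomey's law: computations per joule have historically doubled
# roughly every 1.6 years. The baseline below is an illustrative
# guess at circa-2010 efficiency, not a measured figure.
doubling_period = 1.6                 # years per doubling (assumption)
base_year = 2010
base_ops_per_joule = 1e10             # assumed ops per joule in base_year

doublings_needed = math.log2(landauer_ops_per_joule / base_ops_per_joule)
crossover_year = base_year + doublings_needed * doubling_period

print(f"Landauer limit at {T:.0f} K: {e_min:.3e} J per bit")
print(f"Extrapolated crossover year: {crossover_year:.0f}")
```

With these assumed inputs the crossover lands in the 2060s; tightening the baseline efficiency or the doubling period shifts it toward the ~2050 figure cited above, which is the point – on any plausible extrapolation, irreversible computing hits Landauer’s wall within a few decades.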






