“Nanotechnology is an idea that most people simply didn’t believe”*…
Indeed, in the 1980s, even as nanotech pioneer Eric Drexler, then a graduate student at MIT, was doing the early work of defining and charting a course for the nascent field, MIT’s department of electrical engineering and computer science refused to approve his Ph.D. topic and plan of study (though ultimately the Media Lab did, and Drexler earned his doctorate).
Today the reality– and centrality– of the field are only too apparent and have become the subject of trade and industrial policy… because while the U.S. led in the development of nanotech science, it lags in manufacturing and commercialization. In an excerpt from their book Industrial Policy for the United States: Winning the Competition for Good Jobs and High-Value Industries, Ian Fletcher and Marc Fasteau explain…
Nanotechnology is the manipulation of matter at scales from a fraction of a nanometer to a few hundred nanometers — sizes between individual atoms and small single-celled organisms — at which it has radically different properties. Nanotech is already significant in many industries. Integrated circuits are a form of nanotech. Other nanotech provides the light, strong composites in aircraft and space vehicles. Still other nanotech powers the solid-state lasers used to transmit information through the internet and the light-emitting diodes in LED light bulbs and flat-screen TVs. Nanotech also makes possible solar cells, the batteries in electric cars, and medical technologies such as vaccines. It is thus the unifying thread of many of today’s most advanced technologies. Unfortunately, America is falling behind.
In the future, nanotech-based quantum computing and communications will lead to more powerful computers, transforming national security and internet commerce by making currently secret communications insecure. Medical nanotechnologies will permit targeted interventions at the cellular level, providing new weapons against diseases and biological weapons, as well as defenses against them. China is known to be working on these.
Much of the science underpinning these advances was developed at firms and universities in the US. But the huge manufacturing industries built on it are mostly overseas. For example, the organic light-emitting diode (OLED) technology Kodak created didn’t save that firm from going bankrupt in 2012. But it did enable lucrative businesses for Korea’s Samsung, to whom Kodak licensed the technology, and LG, which bought Kodak’s entire OLED business in 2009. Today, American firms like Nanosys and Universal Display develop important nanotechnologies, but do not actually manufacture the end products and are thus relatively small.
How did the US get itself into this situation? A major government program, the National Nanotechnology Initiative (NNI), has been funded since 2001, but Washington failed to appreciate the importance of having both a technology and a manufacturing strategy. The prevailing wisdom was that if the academic science was supported, mass manufacturing would follow automatically. By contrast, successful rival nations in nanotech have focused on making these technologies manufacturable at scale, employing every policy tool from R&D subsidies to cheap capital to tariffs. A 2020 National Academies review of the NNI urged that the US recognize that ‘the recent, focused, and in some cases novel commercialization approaches of other nations may be yielding better societal outcomes.’…
A little wonky, but both fascinating and important: “Nanotechnology,” via the invaluable Delanceyplace.com.
(Image above: source)
###
As we get small, we might send minuscule birthday greetings to a man whose work has contributed to the development of medical applications of nanotech: Bert Sakmann; he was born on this date in 1942. A cell physiologist, he shared the Nobel Prize in Physiology or Medicine (with Erwin Neher) in 1991 for their work on “the function of single ion channels in cells”– work made possible in part by their invention of the patch clamp.
“Great minds think alike”*…

Brian Potter on the (perhaps surprising) frequency with which “heroic” inventors are in fact better understood as the winners of close races…
When Alexander Graham Bell filed a patent for the telephone on February 14th, 1876, he beat competing telephone developer Elisha Gray to the patent office by just a few hours. The resulting legal dispute between Bell Telephone and Western Union (which owned the rights to Gray’s invention) would consume millions of dollars before being resolved in Bell’s favor in 1879.
Such cases of multiple invention are common, and some of the most famous and important modern inventions were invented in parallel. Both Thomas Edison and Joseph Swan patented incandescent lightbulbs in 1880. Jack Kilby and Robert Noyce patented integrated circuits in 1959. Hans von Ohain and Frank Whittle independently invented the jet engine in the 1930s. In a 1922 paper, William Ogburn and Dorothy Thomas documented 150 cases of multiple discovery in science and technology. Robert Merton found 261 examples in 1961, and observed that the phenomenon of multiple discovery was itself a multiple discovery, having been described over and over again since at least the early 19th century.
But exactly how common is multiple invention? The frequency of examples suggests that it can’t be particularly rare, but that doesn’t tell us the rate at which it occurs. In “How Common is Independent Discovery?,” Matt Clancy catalogues several attempts to estimate the frequency of multiple discovery, and tentatively comes up with a frequency of around 2-3% for simultaneous scientific discoveries, and perhaps an 8% chance that a given invention will be reinvented in the next decade. But the evidence for inventions is inconsistent and varies greatly between studies: another study Clancy cites, which looked at patent interference proceedings between 1998 and 2014, suggests an independent invention rate of only around 0.02% per year.
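To see just how far apart those two estimates are, it helps to put them on the same footing: a 0.02% annual rate compounds to only about a 0.2% chance per decade, roughly forty times lower than the 8%-per-decade figure. A quick sketch of that arithmetic (my own back-of-the-envelope calculation, not from either study):

```python
# Convert the patent-interference study's ~0.02%-per-year independent
# invention rate into a per-decade probability, then compare it with the
# ~8%-per-decade reinvention estimate Clancy arrives at.

annual_rate = 0.0002  # 0.02% per year, from the interference study

# Probability of at least one independent reinvention over 10 years,
# assuming the annual rate is independent year to year.
per_decade = 1 - (1 - annual_rate) ** 10

print(f"per-decade probability: {per_decade:.4%}")        # ≈ 0.20%
print(f"gap vs. 8%/decade estimate: {0.08 / per_decade:.0f}x")  # ≈ 40x
```

However the discrepancy is resolved, a fortyfold gap suggests the two studies are measuring rather different things.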
The frequency of multiple invention is a useful thing to know, because it can give us clues about the nature of technological progress. A very low rate of multiple invention suggests that progress might be driven by a small number of “genius” inventors (what we might call the Great Man Theory of technological progress), and that it might be highly historically contingent (if you re-rolled the dice of history, maybe you get a totally new set of inventions and a different technological palette). A high rate of multiple invention suggests that progress is more a function of broad historical forces (that inventions appear when the conditions are right), and that progress is less contingent (if you re-rolled the dice of history, you’d get a similar progression of inventions). And if the rate of multiple invention is changing over time, perhaps the nature of technological progress is changing as well…
[Potter reviews the history and concludes that “multiple invention was extremely common”…]
… My main takeaway is that the ideas behind inventions are often in some sense “obvious,” or at least not so surprising or unexpected that many people won’t think of them. In some cases, this is probably because once some new possibility comes along, lots of people think of similar things that could be done with it. Once the properties of electricity began to be understood, many people came up with the idea of using it to send signals (telephone, telegraph), or to create motion (engines and generators), or to generate light (arc lamps, incandescent lights). Once the steam engine came along, lots of people had the idea to use it to power various types of vehicles.
In other cases, multiple invention probably occurs because important problems will attract many people trying to solve them. Steel corrosion was a large problem inspiring many folks to look for ways to create a steel that didn’t rust, or notice the potential value if they stumbled across such a material. Lamps causing mine fires were a major problem, inspiring many people to come up with ideas for safety lamps. The smoke produced by gunpowder was a major problem, inspiring many efforts to develop smokeless powders. And because would-be inventors will all draw from the same pool of available technologies, materials, and capabilities when coming up with a solution, there will be a large degree of convergence in the solutions they come up with…
Fascinating: “How Common is Multiple Invention?” from @const-physics.blogsky.venki.dev.
* common idiom
###
As we reconsider credit, we might recall that it was on this date in 1661 that Isaac Newton— a key figure in the Scientific Revolution and the Enlightenment that followed– entered Trinity College, Cambridge. Soon after Newton obtained his BA degree at Cambridge in August 1665, the university temporarily closed as a precaution against the Great Plague. Although he had been undistinguished as a Cambridge student, his private studies and the years following his bachelor’s degree have been described as “the richest and most productive ever experienced by a scientist.”
Relevantly to the piece above, Newton was party to a dispute with Gottfried Wilhelm Leibniz (who started, at age 14, at the University of Leipzig the same year that Newton matriculated at Cambridge) over which of them developed calculus– called “the greatest advance in mathematics that had taken place since the time of Archimedes.” The modern consensus is that the two men independently developed their ideas.

“The street finds its own uses for things”*…
Your correspondent is off again, this time across borders and for a little longer than my last few absences; regular service should resume around April 19…
The estimable Matt Webb on an approach to thinking more comprehensively and creatively about the ultimate impacts of any given innovation…
… I recently learnt about twig, which is a biotech startup manufacturing industrial chemicals using custom bacteria.
The two examples they cite: palm oil which is used in lipstick but displaces rainforests; isoprene which is used to make tyres but comes from fossil fuels.
What if instead you could engineer a strain of bacteria to bulk produce these chemicals sustainably?
The capabilities are present in the metabolic pathways. So that’s what twig does. At scale, is the promise.
- I hadn’t realised this kind of biotech had gotten to commercialisation! And in London too. Good stuff.
- What Are The Civilian Applications?
What Are The Civilian Applications? is of course a Culture ship name, a GSV (General Systems Vehicle) from Use of Weapons by Iain M. Banks.
It is also an oblique strategy we deployed regularly in design workshops back in the day at BERG, introduced (I think? Gang please correct me if I’m wrong) by long-time design leader and friend Matt Jones. That’s his project history. Go have a read.
Let me unpack.
Oblique Strategies (a history) by Brian Eno and Peter Schmidt, 1975: a deck of approx 100 cards, each of which is a prompt to bump you out of a creative hole.
For example:
Honor thy error as a hidden intention

Or:

Discard an axiom

And so on.
In product invention, which is kinda what we did at BERG and kinda what I do now, it’s handy to carry your own toolkit of prompts. So I adopted What Are The Civilian Applications? into my personal deck of oblique strategies.
Therefore.
What would you do with engineered bacteria that can make palm oil or whatever, if it were cheap enough to play with, if the future were sufficiently distributed, if we all had it at home?
Like, it’s a good question to ask. What would civilians do with engineered bacteria?
Tomato soup.
Instead of buying tomato soup at the store, I’d have a little starter living in a jar. A bioreactor all of my own, and I’d fill it with intelligently designed bacteria that eat slop and excrete ersatz Heinz tomato soup.
I’m not 100% sure what “slop” is in this context. The food I mean. Maybe the bacteria just get energy from sunlight, fix carbon from the air, and I drop in a handful of vitamin gummies or fish flakes every Monday?
A second oblique strategy adopted into my personal deck over the years:
“A good science fiction story should be able to predict not the automobile but the traffic jam” – Frederik Pohl. As previously discussed re a national drone network.

Let’s say I can go to the store and buy a can of Perpetual Heinz, or however they brand it. A can with a sunroof on the top and a tap on the side that I keep in the garden and I can juice it for soup once a week for a year, or until the bacterial population diverges enough that I’m at risk of brewing neurotoxins or psychedelics or strange and wonderful new flavours or something.
Heinz is not going to like that, economically. They’ll require me to enrol in some kind of printer and printer ink business model where I have to subscribe to the special vitamin pills to keep (a) the soup colony alive and (b) their shareholders happy.
Which will end up being pricey, like the monthly cash we all pay out to mutually incompatible streaming services. Demand will arise for black market FMCGs on the dark web. Jars of illegal Infinite Coca Cola that only requires the cheap generic slop and it tastes just the same.
So I love to play with these strategies and imagine what the world might be like. Each step makes a sort of sense yet you end up somewhere fantastical – that’s the journey I want to take you on in text, too. Then the game, in product invention, is to take those second order possibilities and bring them back to today. (I’m giving away all my secrets now.)
But I prefer cosier, more everyday futures:
Grandma’s secret cake recipe, passed down generation to generation, could be literally passed down: a flat slab of beige ooze kept in a battered pan, DNA-spliced and perfected by guided evolution by her own deft and ancient hands, a roiling wet mass of engineered microbes that slowly scabs over with delicious sponge cake, a delectable crust to be sliced once a week and enjoyed still warm with cream and spoons of pirated jam.
A small jar of precious, proprietary cake ooze handed down parent to child, parent to child, together with a rack filled with the other family starter recipes, a special coming of age moment, a ceremony…
Thinking broadly and deeply about the implications of innovations: “What Are The Civilian Applications?” from @genmon.fyi.
(Image above: source)
* William Gibson
###
As we ponder the particulars of progress, we might spare a thought for Francis Bacon– the English Renaissance philosopher, lawyer, linguist, composer, mathematician, geometer, musician, poet, painter, astronomer, classicist, historian, theologian, architect, father of modern empirical science (The Baconian– aka The Scientific– Method), and patron of modern democracy, whom some allege was the illegitimate son of Queen Elizabeth I of England (and others, the actual author of Shakespeare’s plays). He died on this date in 1626… after (about a month earlier) he had stuffed a dressed chicken with snow to see how long the flesh could be preserved by the extreme cold. He caught a cold and perished from its complications.
“Nothing is built on stone; all is built on sand”*…

(Roughly) Daily has looked before at that most common– and essential– of substances, sand. (See here, here, and here.) Today, via Michaela Büsse, an update…
After water, sand is the second most used material in the world. Each year, approximately 40-50 billion metric tons of sand are consumed worldwide.
This accounts for 79% of all aggregates extracted and traded, making sand the literal foundation for global human infrastructure. Sand plays a vital role in the production of glass, steel, and concrete. Silica, one of the most common minerals found in sand, is the key ingredient in silicon chips and thus for the development of digital technologies. But sand is also fundamental to the creation and maintenance of land itself, rendering it constitutive to processes of urbanization. Artificial islands, port expansions, and beach nourishment projects consume vast quantities of sand. As the bedrock of urban infrastructures, sand is embedded in the very fabric of modern life. Yet, its ubiquity belies its complexity. As a sediment, sand is foundational for the functioning of ecosystems. The relentless expansion and intensification of cities is starving rivers and coasts of sediment, depleting sand at a rate that far exceeds its natural replenishment.
Intensive dredging of rivers and seabeds has fundamentally altered sedimentation patterns, disrupting the delicate equilibrium that governs ecosystems. Rivers, which once carried sand from mountains to coastlines, now struggle to replenish beaches and wetlands. This depletion has far-reaching consequences. Without sufficient sand deposits, coastlines are left vulnerable to erosion, rising sea levels, and the devastating impact of extreme weather events. In ecosystems already on the front lines of climate change—like deltas, wetlands, and estuaries—the effects of sand extraction are compounded. Delta regions, for instance, rely on continuous sediment deposits to counteract the natural sinking of land. When sand is removed faster than it can be replaced, these regions are exposed to subsidence, where land sinks at an accelerated rate, amplifying flood risk and increasing the salinization of freshwater resources. Such impacts are often delayed, manifesting years or even decades after extraction, making them challenging to monitor and mitigate effectively.
As global sand consumption surges to unprecedented heights, the profound and far-reaching consequences of extraction come sharply into focus. Numerous journalistic and scientific accounts warn of the “looming tragedy of the sand commons,” highlighting environmental concerns related to dredging and mining sand, such as pollution, biodiversity loss, and soil disturbance, as well as illegal practices in the sand trade. The reality of the sand trade is both dirty and messy, intertwining national and transnational politics. In regions like Southeast Asia, rapid urbanization and investments in large-scale infrastructure projects have spurred an unprecedented demand for this essential resource. Here, land reclamation has emerged as a flashpoint where extraction practices intersect with issues of sovereignty, livelihoods, and environmental justice, transforming sand into a highly sought-after and contested commodity. Building new land for some means taking old land from others. The exploitation of sand goes hand in hand with exploitative labor and geopolitical maneuvering.
Sand’s impending scarcity has fueled a black market, giving rise to “sand mafias”—criminal organizations that exploit extraction and trade through corruption, violence, and intimidation, often circumventing national mining and export bans. It is not uncommon for sand to become a matter of life and death for those who mine it as well as for those who seek to prevent it from being mined. Across the world, activists and local communities have mobilized against sand extraction and land reclamation, fighting the prevailing narratives of development and progress that often justify environmental exploitation. However, these initiatives are rarely successful, resulting (at best) in compensation payments to the affected communities. A transboundary governance of sand would require international standards, which many researchers and organizations have requested. Even so, it is nearly impossible to control the natural flow of sand.
As sand transitions from a sediment to a precious resource, it has become instrumental in urban ideals of late modernity. Cities like Dubai and Singapore epitomize how architectural ambition is built on vast quantities of imported sand. Land built from scratch, towering skyscrapers, and sprawling infrastructure are testaments to sand’s transformative potential. Yet, these urban landscapes are haunted by their materiality: each grain is a silent witness to the ecological and social disruptions that enabled its journey. The sand in these structures embodies the persistence of environmental degradation, displaced labor, and the exploitation that made them possible. In this way, sand is both an architect and a specter of modernity’s unrestrained ambitions, leaving us to confront the shadows cast by our own constructions…
Eminently worth reading in full: “Granular Power: The Gritty Politics of Sand,” from @michaelabussey.bsky.social and @eflux.bsky.social.
* Jorge Luis Borges
###
As we get grainy, we might send insightful birthday greetings to James Hansen; he was born on this date in 1941. An atmospheric physicist, he was Director of the NASA Goddard Institute for Space Studies (from 1981 to 2013). He is best known for his June 1988 testimony to the Senate Energy and Natural Resources Committee that the cause of climate change was known with 99% certainty to be the buildup of carbon dioxide and other artificial gases in the atmosphere– helping raise broad awareness of global warming– and for his advocacy of action to avoid dangerous climate change. (Hansen has since proposed a revised explanation of global warming, in which the 0.7°C global mean temperature increase of the last 100 years can be to some extent explained by the effect of greenhouse gases other than carbon dioxide, such as methane.)
Currently the Director of the Program on Climate Science, Awareness and Solutions of the Earth Institute at Columbia University, he remains a climate activist.
“It is a capital mistake to theorize before one has data”*…
The estimable Claudia Sahm on what the elimination of an obscure advisory committee on economic data says about the administration’s commitment to relevance and accuracy…
In a time of great economic uncertainty, President Donald Trump’s administration quietly took a step last week that could create even more: Secretary of Commerce Howard Lutnick disbanded the Federal Economic Statistics Advisory Committee.
I realize that the shuttering of an obscure statistical advisory committee may not strike anyone as a scandal, much less an outrage. But as an economist who has presented to the committee, known as FESAC, I know how it improved the information used by both the federal government and private enterprise to make economic decisions. Most Americans do not realize how many aspects of their lives rely on timely and accurate government data.
One of FESAC’s official responsibilities was “exploring ways to enhance the agencies’ economic indicators to make them timelier, more accurate, and more specific to meeting changing demands and future data needs.” In the complex and highly dynamic US economy, this is an ongoing effort — not a one-time task that has been “fulfilled,” which was the Commerce Department’s stated reason for terminating the committee.
The 15 members of the advisory committee, who were unpaid, brought deep technical expertise on economic measurement from the private sector, academia and the non-profit world. They were a sounding board for the Census Bureau, Bureau of Labor Statistics, and Bureau of Economic Analysis, which produce much of the nation’s official statistics.
If statistics fail to keep up with the changing economy, they lose their usefulness. When the committee last met in December, one focus was on measuring the use and production of artificial intelligence. Staff from the agencies shared existing findings on AI, such as from the Business Trends and Outlook Survey that began in 2022, and outlined new data collection efforts. AI’s current use among businesses has nearly doubled since late 2023, and even more businesses expect to adopt AI in the next six months.
The committee was asked what data products would be most useful. Expert feedback, including a request to harmonize the definitions of AI across surveys and align with cutting-edge research, is especially valuable at the early stages of data collection. The growth and employment effects of AI are among the most pressing questions facing the economy, and external experts are crucial to supporting the creation of high-quality data.
Enhancing official economic statistics under budget constraints often requires creative approaches. At its meeting last June, the committee discussed using private-sector data to create statistics on regional employment and other outcomes. There is considerable demand among businesses and local governments for timely geographic detail, but it is cost-prohibitive with current government surveys. Members of FESAC, some of whom work at companies like Indeed and JPMorgan Chase, offered first-hand knowledge of the pros and cons of using private-sector data.
The committee contributed far more than just twice-a-year meetings. It also created relationships with the private sector that government agencies could draw on as part of their continuing effort to improve their statistics.
The National Academies of Sciences, in discussing best practices for statistical agencies, argues that external advisory committees are a good way to engage with users of the data and obtain expert advice. Moreover, external evaluation should be part of regular program reviews to ensure quality, relevance and cost-effectiveness. That’s exactly what FESAC did.
The statistical agencies need more, not fewer, resources now to meet their challenges. During the campaign, Trump repeatedly questioned the credibility of US employment statistics. In particular, he claimed that the downward revisions of monthly payrolls showed political interference. Senators Bill Cassidy and Susan Collins asked the Bureau of Labor Statistics to explain why large revisions were happening and how to avoid them. FESAC could have been a valuable resource for possible improvements.
Disbanding FESAC does not advance the administration’s goal of greater efficiency in the government. In 2024, the committee’s cost was expected to be a modest $120,000, covering travel expenses and minimal staff support. Virtual-only meetings could have reduced those costs still further, if that was a concern. Regardless, the benefits to the millions of data users from regular reviews by external experts far exceed that negligible cost.
Putting a low-cost, high-value committee on the chopping block does not bode well for other investments in the official statistics. Reductions in staff and budget would likely degrade the quality of the official statistics. Even before Trump took office, all three agencies operated in a tight budget environment.
Reduced transparency in official statistics is perhaps the most troubling aspect of disbanding FESAC. Cutting off agency staff from external advisers creates an environment where political interference could occur much more easily — and go undetected. With political officials such as Lutnick arguing publicly that GDP should exclude government spending, it is especially important to have external, independent experts.
And FESAC is not alone. By executive order, the administration is ending several advisory committees in the federal government, reducing transparency and the technical resources for agencies. It’s a short-sighted approach that could undermine essential government services…
“The War on Government Statistics Has Quietly Begun” (gift link) from @claudia-sahm.bsky.social in @bloomberg.com.
Apposite: “The True Cost of Trump’s Cuts to NOAA and NASA,” “Trump’s shocking purge of public health data, explained,” and “Trump USDA Sued for Erasing Webpages Vital to Farmers“… and so many– too many– others.
(Image above: source… note how many of the data sources cited are precisely the sorts of government resources being targeted)
* Sherlock Holmes (Arthur Conan Doyle)
###
As we drive with our windows painted over, we might send understanding birthday greetings to Robert Heilbroner; he was born on this date in 1919. An economist and historian of economic thought, he was the author of some two dozen books, the best known of which is The Worldly Philosophers: The Lives, Times and Ideas of the Great Economic Thinkers (1953), a remarkable survey of the lives and contributions of famous economists (perhaps most notably Adam Smith, Karl Marx, and John Maynard Keynes). Your correspondent can also recommend The Future as History (1960).
Heilbroner was considered highly unconventional by those in his field; indeed, he regarded himself a social theorist and “worldly philosopher” (philosopher pre-occupied with “worldly” affairs, such as economic structures) and tended to integrate the disciplines of history, economics, and philosophy into his work. Nonetheless, Heilbroner was recognized by his peers as a prominent economist and was elected vice president of the American Economic Association in 1972.