(Roughly) Daily

“We need to return from the self-centered concept of sincerity to the other-centered concept of truth”*…

Research universities have been central to the accomplishments of “The American Century.” Their work has laid the foundation for major advances in health and medicine, technology, communications, agriculture/food, economics, energy, and national security at the same time that they have educated students to be scientific leaders and innovators.

Research universities originated in Prussia in the early 19th century (animated by Wilhelm von Humboldt’s vision of Einheit von Lehre und Forschung [the unity of teaching and research]). And indeed, into the early 1930s the world’s leading research universities were in Germany.

As historian Roger L. Geiger has explained, “the model for the American research university was established by five of the nine colonial colleges chartered before the American Revolution (Harvard, Yale, Pennsylvania, Princeton, and Columbia); five state universities (Michigan, Wisconsin, Minnesota, Illinois, and California); and five private institutions conceived from their inception as research universities (MIT, Cornell, Johns Hopkins, Stanford, and Chicago).” The American research university first emerged in the late 19th century, when these fifteen institutions began to graft graduate programs derived from the German model onto undergraduate programs derived from the British model.

By 1960, U.S. research universities had become the global model; they still dominate the top of global university rankings (see, e.g., here, here, and here).

But as Nils Gilman explains, their pivotal role is in jeopardy…

I wrote about MAGA’s coming assault on US higher education in the first week of the new administration. Here’s a brief update.

Within two weeks of the new regime taking office, the National Institutes of Health lost its director and deputy director, and the new leadership announced that NIH was axing overhead costs on research grants — the operational lifeblood of large research universities. (Like everything else Team Trump has been trying to do, this effort is caught up in litigation, though who knows whether the Trumpniks will pay attention to adverse rulings.) Should these moves go forward, they will kill the golden goose of US biomedical research.
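For a sense of what “axing overhead” means in practice, some illustrative arithmetic (the numbers are mine, not Gilman’s): universities recover facilities-and-administration costs as a negotiated rate r applied on top of a grant’s direct costs D, so a grant pays out D × (1 + r). Negotiated rates at major research universities commonly run in the 50–60 percent range; under the 15 percent cap NIH has proposed, a grant with $1,000,000 in direct costs would carry $150,000 in overhead (0.15 × D) rather than roughly $550,000 (0.55 × D) – a cut of almost three-quarters in the money that pays for lab space, equipment, and compliance.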

The regime has also been threatening to withhold federal funding from colleges that do not kill programs at odds with the MAGA agenda, including abolishing DEI efforts and banning transgender athletes from participating in women’s sports. Two new fronts were opened last week with a direct targeting of Columbia University, ostensibly because of its coddling of Gaza protestors last year — including the demand that it place one of its academic departments in receivership. And on Friday, the regime announced investigations into 52 universities as part of its DEI crackdown. All of this entails a viewpoint-based assault on academic freedom that is unprecedented in American history, way beyond even McCarthyism — though, as my friend John Aubrey Douglass wrote a couple of years ago, it is a standard part of the modern authoritarian playbook.

Five brief observations:

  1. The MAGA axe is falling on all fields whose Wissenschaft is at odds with the ideological agenda of some faction of MAGA. This includes not just definitionally “woke” departments like ethnic and gender studies; it will envelop the whole of the humanities, as well as biomedical research and climate science.
  2. True to its nativism, the new regime seems especially keen on reining in transnational scientific collaboration. Furthermore, as MAGA closes off travel from much of the world, holding major conferences in the US will become physically impossible (not to mention intellectually indefensible, as John Quiggin says).
  3. As yet, there appears to be little if any coordinated pushback against any of this, either politically or even as an industry. If you look at the websites of the Association of Public and Land-grant Universities, the International Association of University Presidents, the American Association of Colleges & Universities, or other similar organizations, you’d have no idea that anything untoward is happening. Incredibly, despite the unabashed way MAGA telegraphed its intention to do all of this, university leaderships appear to be totally unprepared. What I take this to mean is that it’s going to happen, more or less unopposed.
  4. I expect that the evisceration of US research universities will spell the end of the long US dominance of research publications. It will also hasten the collapse of peer review — already in trouble for several reasons, including the replicability crisis and a business model predicated on free labor from US academics — as the standard quality control mechanism for scholarship.
  5. It is a sign of the times that several French research organizations and universities are now beginning to offer landing spots for US academics who find they “can no longer pursue their activities in their country of origin due to wars, political persecution, or censorship.” This may save a few individual careers, but obviously it’s not going to work for more than a tiny fraction of the more than half a million ladder-rank faculty in the country.

These are just ongoing field notes from the front, and I don’t have any big conclusions to draw yet, so I’ll just repeat what I’ve been saying on this topic for years:

In 1933, German research universities were by every measure the greatest in the world. This intellectual power was in turn a cornerstone of German industrial and ultimately military might. In a few short years, however, using tactics not dissimilar to the ones listed above, the Nazis destroyed them — not least because the universities themselves went along with what was being done to them (Selbst-Gleichschaltung, as it was known). And nearly a century on, German universities have still not recovered, despite many proposed efforts — and neither has Germany’s prestige or power.

Self-inflicted wounds: “The MAGA assault on US universities has begun in earnest,” and, @nilsgilman.bsky.social suggests, it appears that the universities are going to surrender passively.

A case-in-point attack, with a suggested response: “What should be Columbia University’s legal answer to the extortionate & unconstitutional demands of the Trump administration.”

See also: “First they came for Columbia” (and this different piece with the same title), “The Crumbling Foundations of American Strength,” and “The Economic Danger of Disinvesting in Education.”

(Image above: source)

* Iris Murdoch

###

As we re-educate, we might recall that it was on this date in 2003 that President George W. Bush announced the invasion of Iraq, the start of the Iraq War (AKA, the Second Gulf War). 22 days later, American (and Allied) forces took Baghdad, and on May 1, President Bush announced “Mission Accomplished.” In the event, U.S. military forces remained in Iraq until their withdrawal in 2011.

source

“Google will bring you back 100,000 answers. A librarian will bring you back the right one.”*…

A reference desk in a Michigan library, 1980s

Stephen Akey remembers the time before online search…

How do you find the life expectancy of a California condor? Google it. Or the gross national product of Morocco? Google it. Or the final resting place of Tom Paine? Google it. There was a time, however—not all that long ago—when you couldn’t Google it or ask Siri or whatever cyber equivalent comes next. You had to do it the hard way—by consulting reference books, indexes, catalogs, almanacs, statistical abstracts, and myriad other printed sources. Or you could save yourself all that time and trouble by taking the easiest available shortcut: You could call me.

From 1984 to 1988, I worked in the Telephone Reference Division of the Brooklyn Public Library. My seven or eight colleagues and I spent the days (and nights) answering exactly such questions. Our callers were as various as New York City itself: copyeditors, fact checkers, game show aspirants, journalists, bill collectors, bet settlers, police detectives, students and teachers, the idly curious, the lonely and loquacious, the park bench crazies, the nervously apprehensive. (This last category comprised many anxious patients about to undergo surgery who called us for background checks on their doctors.) There were telephone reference divisions in libraries all over the country, but this being New York City, we were an unusually large one with an unusually heavy volume of calls. And if I may say so, we were one of the best. More than one caller told me that we were a legend in the world of New York magazine publishing.

“How do you people know all this stuff?” a caller once asked me. “What are you, some kind of scholars or wordsmiths or something?”

“No,” I replied. “Just us librarians.”

Actually, we didn’t know all that stuff; we just knew how to find it. I myself rarely remembered any of the facts I divulged to our callers, but I remembered the reference sources where I found the facts. Personal knowledge was inadmissible. I could reel off by heart the names of the four Dead Boys (Cheetah Chrome, Stiv Bators, Jimmy Zero, and Johnny Blitz—but didn’t everyone know that?), but unless I could track them down and—rule number one—cite the source (in this case, probably the Rolling Stone Encyclopedia of Rock and Roll), I had no information to impart and no answer to give to anyone who might need that information for whatever reason. But we almost always found the right source.

The progenitor and enforcer of rule number one was our department head, whose managerial style recalled that of Vince Lombardi, if Vince Lombardi had had no interest in football. I do wish Milo had been a tad less heavy-handed; he tended to reduce unsatisfactory initiates to tears before driving them from the department for keeps. Nevertheless, his grinding relentlessness, which often entailed instructions barked into one ear while one’s other ear might be dealing with a difficult and demanding caller, was in the service of professionalism and competence—necessary qualities in a small, claustrophobic office where the pressure from our backlog of callers never let up.

“Are you that nice young man who always goes out of his way to find me exactly the answers I need to the questions I ask?” a caller once asked me as a prelude to her inquiry.

“Doesn’t sound like me,” I said.

There was always psychology involved. In this case, the caller thought that by flattering me she might induce me to break or bend our rule of five minutes or three questions max, which we routinely disregarded anyway. The opposite psychological ploys—bullying, intimidating, insulting, threatening—were far more common. Contrary to the popular perception of librarianship as a serene, leisurely vocation for the bookishly inclined, the Telephone Reference Division was a high-stress environment, and most staffers, myself included, burned out within a few years. Now that reference librarianship is a shadow of its former self, psychological gamesmanship rarely takes place. You look up your information in a bland, seemingly (seemingly) trustworthy source like Wikipedia, and that’s that. Librarians have other things to do, principally programming a never-ending stream of ostentatiously unlibrary-like events, but none will ever be so interesting or so much fun as the kind of thing we did in Telephone Reference before the Internet swept it all away.

Did Charon row or pole the souls of the dead across the River Styx? Can you give me the names and addresses of manufacturers of prosthetic devices in Massachusetts? Where are the manuscripts of the composer Marc Blitzstein to be found? (The person asking that question, much to the excitement of my balletomane boss Milo, identified herself as a certain Agnes de Mille.) What was the first language ever spoken? (“Anywhere? At any time?” I asked the caller. “Yes,” she replied, before I suggested that we might try to reformulate the question.) On and on it went. Of course, what we were doing, millions of others were doing on their own without the intercession of any librarian. All of us were negotiating an informational world without algorithmic search engines. Although I hang on to some battered dictionaries and reference books, I resort to Google as readily as anyone else. Undoubtedly, much more has been gained than lost in the transformation of laborious research into something immediate, accessible, and available to everyone. Still, a world that has tossed out the scholarly, comprehensive, and authoritative print edition of the Encyclopaedia Britannica in favor of the colorless, death-by-a-thousand-edits mediocrity of Wikipedia is not necessarily a richer one.

Even without my nostalgia for certain antiquated and specialized reference books (Kane’s Famous First Facts, the Encyclopedia of Associations, Brewer’s Dictionary of Phrase and Fable), I do think certain advantages accrued to analog ways of gathering information. The many hundreds of reference sources that we were trained to use in Telephone Reference had their biases, their blind spots, their inaccuracies. In the apprenticeship each of us endured under Milo’s exhausting tutelage before getting anywhere near a telephone, we learned not merely how to find information but how to think about finding information. Don’t take anything for granted; don’t trust your memory; look for the context; put two and three and four sources together, if necessary. Sometimes it was difficult to communicate such variables to our callers, who just wanted a quick answer rather than a disquisition on the mistaken assumption that the transmission of information was a straightforward matter. How many laundromats were owned and operated by women in California and Oregon in the 1930s? To answer that question, someone would have had to gather and compile that information at the time, and there was no reason to believe that anyone would have thought to do so. Maybe some obscure state agency did tabulate all those female laundromat owners and I simply fumbled an answerable inquiry, but if so, that agency would have been thinking like a gender-conscious individual from the 1980s rather than a government bureaucracy from the 1930s.

“Think like a librarian,” Milo used to urge us, which might sound less impressive than “Think like a philosopher,” “Think like a psychologist,” or even “Think like a lawyer,” but it did make the point that information wasn’t given, that it had to be actively sought…

… A certain esprit de corps facilitated the work and even defused tensions in that pressure cooker of an office. I knew a lot about rock-and-roll and spoke Spanish. Aaron had a law degree and took all the questions about legal research that stumped us. (He also dispensed free legal advice on occasion, until Milo put a stop to it.) Milo knew theater; Paul was francophone; Kathleen knew movies and pop culture. (Our preferences skewed arty left-of-center, which was inevitable in our milieu.) Sometimes we worked backward, pooling what we already knew to find the reference sources that would confirm (and occasionally contradict) the foregone conclusion. Another rule: Don’t hide your ignorance. There was no Google to cover up the gaps in our knowledge. Sally Jessy Raphael might have been the prime minister of New Zealand or she might have been an exceedingly unctuous talk show host. Unless I asked who she was (the latter, not the former), I wouldn’t know the best sources to check to find her place of birth. As expected, the caller who asked about Ms. Raphael spent a certain amount of time insulting me for my ignorance, but she got her answer.

Many of our callers were historical novelists. Some of them identified themselves as such, but it was usually obvious even when they didn’t. They tended to ask questions like “What time was low tide in Boston Harbor on May 14, 1932?”

If today I were writing a historical novel set in the 1980s, I might ask, “How did people find information in those days?” There would no longer be any telephone reference librarians to help me, so I’d have to trust to luck—and a search engine—and answer that question myself: They used logic, inference, imagination, and a tall pile of reference books…

Dispatches from the telephone reference desk: “The Department of Everything,” in @hedgehogreview.

* Neil Gaiman

###

As we stroll down memory lane, we might send thoughtfully-retrieved birthday greetings to Daniel Boorstin; he was born on this date in 1914.  As a Rhodes Scholar, Boorstin took first-class honors in jurisprudence at Oxford and was admitted as a barrister to the Inner Temple in 1937.  Two years later, he returned to the US to teach history, first at Harvard, then at the University of Chicago.  He left Chicago in 1969 to become the director of the National Museum of American History at the Smithsonian Institution. He’s probably best-known for his three-volume history, The Americans, the third volume of which, The Americans: The Democratic Experience, won the Pulitzer Prize in 1974.

In 1975 Boorstin became the Librarian of Congress, a post he held until 1987. The de facto national library of the United States, the LoC is the oldest federal cultural institution in the country. It executes its primary mission of informing legislation by researching inquiries made by members of Congress via its version of an (enhanced) reference desk, the Congressional Research Service. (The library is open to the public for research, although only members of Congress, Congressional staff, and library employees may borrow books and materials for use outside of the library.)

source

“Having to read footnotes resembles having to go downstairs to answer the door while in the midst of making love”*…

Gertrude Himmelfarb begs to differ: “The footnote would seem to be the smallest detail in a work of history. Yet it carries a large burden of responsibility, testifying to the validity of the work, the integrity (and the humility) of the historian, and to the dignity of the discipline.”

Matthew Wills channels the estimable Anthony Grafton in defense of the oft-maligned marginalia…

“The history of the footnote may well seem an apocalyptically trivial topic,” writes historian Anthony Grafton. “Footnotes seem to rank among the most colorless and uninteresting features of historical practice.” And yet, Grafton—who has also written The Footnote: A Curious History (1999)—argues that they’re actually pretty important.

“Once the historian writes with footnotes, historical narrative becomes a distinctly modern” practice, Grafton explains. History is no longer a matter of rumor, unsubstantiated opinion, or whim.

“The text persuades, the note proves,” he avers. Footnotes do double duty, for they also “persuade as well as prove” and open up the work to a multitude of voices.

Leopold von Ranke (1795–1886), the founder of source-based history, is usually credited with the “invention” of the scholarly footnote in the European tradition. Grafton describes von Ranke’s theory as sharper than his practice: his footnoting was much too sloppy to be a model for scholars today. But various forms of footnotes were used long before von Ranke. Sources were of vital importance to both Roman lawyers and Christian theologians in late antiquity and the early Middle Ages, as they strove to back up their own arguments with the weight and gravitas of others…

The history– and importance– of annotation: “History’s Footnotes,” @scaliger via @JSTOR_Daily.

* Noel Coward

###

As we check our references, we might spare a thought for James MacGregor Burns; he died on this date in 2014. A historian and political scientist, he is best known for his biographies of American Presidents; his work on America’s 32nd president, Roosevelt: The Soldier of Freedom, won both the Pulitzer Prize and the National Book Award in History and Biography in 1971.

His work was influential in the field of leadership studies, shifting its focus from the traits and actions of “great men” to the interaction of leaders and their constituencies as collaborators working toward mutual benefit.

source

“Nature does not hurry, yet everything is accomplished”*…

Paul Constance on a chain of dedicated scientists who are building data sets on the natural world around us, and how– coupled with emerging new nature apps that enable citizen scientists– they are expanding our ecological attention span into the long now…

Every two weeks from March to November, Chris Halsch walks a ten-mile loop near the Donner Pass, high up in California’s Sierra Nevada, for the sole purpose of counting butterflies.

It is one of five sites at various altitudes that Halsch, a PhD candidate at the University of Nevada, Reno, has been visiting with metronomic regularity for the past five years. At each one he retraces his steps, pausing every so often to jot down species and numbers in a notebook.

Along the way, he sometimes meets recreational birders or hikers who take photographs and use nature apps to identify species for fun. But unlike those random snapshots, Halsch’s notes are a coveted resource for scientists. Once he types them into a spreadsheet, each of his data points adds a new segment to a chain of observations that has been growing without interruption for half a century, in one of the world’s longest-lived efforts to monitor butterfly populations. Like a relay runner, Halsch is extending a marathon of sustained attention that began 20 years before he was born.

Multi-decadal time-series of field observations are among the rarest and most valuable artifacts in ecosystem science because they help to overcome a peculiar weakness in our ability to perceive and interpret the natural world. Humans have developed powerful methods for reconstructing events in the distant past, from the birth of a galaxy to mass extinctions in the Devonian. We have built instruments that can parse the present down to the zeptosecond. But when it comes to the modest timescale of our own lifespans, we are like near-sighted moles.

Weren’t there more birds in this meadow when we were kids?

Doesn’t it seem like spring is a lot rainier than it used to be?

Are you sure it’s safe to eat fish from this river?

Our answers to these types of questions are notoriously unreliable. Think of the tendency to describe a single weather event as evidence for (or against) climate change, or the panic caused by invasive zebra mussels that, 20 years later, turns out to have been misplaced. Perceptions are distorted by selective memories, cognitive biases, political agendas and shifting baseline syndrome—the propensity of each generation to gradually forget past environmental conditions and accept present ones as normal. In an essay published in 01990, the zoologist John J. Magnuson wrote that this temporal myopia can trap us in the “invisible present,” a space where we fail to see slow changes and are unable to interpret effects that lag years behind their causes. “In the absence of the temporal context provided by long-term research, serious misjudgments can occur not only in our attempts to understand and predict change in the world around us, but also in our attempts to manage our environment,” he warned.

Magnuson was echoing a group of mid-century scientists who believed that some of the biggest questions in ecology could only be answered with field observations that were carefully structured and repeated at the same sites for at least two decades. The longer the time-series, the greater the likelihood that the invisible present will “melt away,” exposing the complex and often unexpected dynamics of ecosystem change….
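A toy simulation makes Magnuson’s “invisible present” concrete (a hypothetical sketch of mine, not anything from Constance’s piece): a population declining by one count per year, buried under ordinary year-to-year noise, looks trendless in a five-year window, while a fifty-year series recovers the decline.

```python
# Illustrative only: a slow trend hidden by noise is invisible in a short
# observation window but emerges from a long one. Synthetic data, not
# real field counts.
import random

random.seed(42)
YEARS = 50
TREND = -1.0    # true decline: one count per year
NOISE = 15.0    # year-to-year variation that swamps the trend

counts = [200 + TREND * t + random.gauss(0, NOISE) for t in range(YEARS)]

def slope(ys):
    """Ordinary least-squares slope of ys against 0, 1, ..., len(ys)-1."""
    n = len(ys)
    xbar, ybar = (n - 1) / 2, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

print(f"trend estimated from the last 5 years: {slope(counts[-5:]):+.2f}/yr")
print(f"trend estimated from all 50 years:     {slope(counts):+.2f}/yr")
# The five-year estimate is dominated by noise; the fifty-year estimate
# comes out close to the true -1.0 per year.
```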

[Constance describes a variety of efforts underway…]

… Collectively, these efforts are widening the aperture of our ecological attention, enabling scientists to find and stitch together scattered fragments of temporal data into panoramas that tell a more illuminating story about the interactions that drive change. Unfortunately, the emerging picture is still largely focused on wealthy countries—particularly ones with long histories of field-based science. A map of the International LTER network, an association of 750 field stations that, like their U.S. counterparts, are making long-term observations, shows that more than two thirds are concentrated in Western Europe. Numerous countries in Asia, Africa and Latin America have no stations at all. Moreover, even as scientists like Moran and Grames are exploiting the new wealth of temporal evidence, it is not clear how this research will influence the wider culture, where the blinkered perceptions of the “invisible present” are still pervasive.

The trend that may ultimately overcome both of these limitations is driven, paradoxically, by smartphones. Non-scientists have long been a critical source of field labor for long time-series, most famously for the Audubon Society’s 122-year-old Christmas Bird Count, but also in hundreds of smaller projects that monitor other kinds of flora and fauna.  Now, smartphones with powerful cameras and apps such as eBird, iNaturalist, Seek and Picture Insect have enabled millions of casual observers to supplement this pool of dedicated volunteers. Despite the lively debate on whether smartphone usage in the outdoors enhances or interferes with people’s appreciation of nature, one fact is clear: because nature apps automatically time-stamp, geo-reference and store each observation in a robust database, they are generating potential time-series at an unprecedented scale.

In the 20 years since the Cornell Lab of Ornithology launched eBird, the app has amassed more than one billion observations by 700,000 birders from every country in the world. Carrie Seltzer, who heads stakeholder engagement at iNaturalist, says that more than 2.4 million people have made observations on the app, at a rate that has grown between 50 percent and 100 percent per year since 02012… This torrent of raw field data vastly exceeds what even well-funded researchers could ever dream of gathering with traditional methods…
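To see why app records are so readily turned into time-series, consider a minimal sketch (the field names below are illustrative, not eBird’s or iNaturalist’s actual schema): every observation already carries the species, timestamp, and coordinates that monitors like Halsch record by hand, so a yearly count series is a one-line aggregation.

```python
# Hypothetical observation record and aggregation; the fields are
# illustrative, not the actual eBird/iNaturalist schema.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Observation:
    species: str
    timestamp: datetime  # auto time-stamped by the app
    lat: float           # auto geo-referenced
    lon: float

def yearly_counts(observations, species):
    """Collapse raw sightings of one species into a per-year series."""
    return Counter(o.timestamp.year for o in observations if o.species == species)

obs = [
    Observation("Monarch", datetime(2012, 6, 14), 39.32, -120.33),
    Observation("Monarch", datetime(2012, 7, 2), 39.31, -120.35),
    Observation("Monarch", datetime(2023, 6, 20), 39.32, -120.34),
]
print(sorted(yearly_counts(obs, "Monarch").items()))
# [(2012, 2), (2023, 1)] -- the skeleton of a long-term series
```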

Understanding the world around us: “Peering into The Invisible Present,” from @presentbias and @longnow. Eminently worth reading in full– then browsing the other remarkable pieces available on the Long Now website.

* Lao Tzu

###

As we take the long view, we might send insightful birthday greetings to a man who encourages us to see in different ways, M. C. Escher; he was born on this date in 1898. A graphic artist inspired by mathematics, he created woodcuts, lithographs, and mezzotints that, while largely ignored by the art world in his lifetime, have become widely celebrated. He’s been recognized as an heir to Parmigianino, Hogarth, and Piranesi.

His work features mathematical objects and operations including impossible objects, explorations of infinity, reflection, symmetry, perspective, truncated and stellated polyhedra, hyperbolic geometry, and tessellations. And though Escher believed he had no mathematical ability, he interacted with the mathematicians George Pólya, Roger Penrose, and Donald Coxeter, and the crystallographer Friedrich Haag, and conducted his own research into tessellation.

For more on (and more examples of) Escher’s work, see here.

Helix, v.3, no.7 (source)
M. C. Escher (source)

“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…

As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…

In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…

In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2017 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.

A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.
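Some back-of-the-envelope compounding (my arithmetic, not the article’s) shows what these rates imply. Gordon’s two TFP growth rates, sustained over fifty years, multiply productivity very differently, and a 5 percent annual decline halves research productivity roughly every fourteen years:

$$(1.019)^{50} \approx 2.56 \quad\text{vs.}\quad (1.006)^{50} \approx 1.35, \qquad t_{1/2} = \frac{\ln 2}{0.05} \approx 13.9\ \text{years}$$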

This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research-intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…

… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.

This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.

Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…

How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.

See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”

* Robert Gilpin

###

As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.
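The “coincident current” in the patent’s title names the selection trick that made core planes scale: each X and Y drive line carries only half the current needed to flip a core, so only the single core at their intersection sees full current, while every other core on those lines is undisturbed. A minimal sketch of that selection logic (illustrative only, not a model of Forrester’s actual circuits, which also needed sense and inhibit wires):

```python
# Toy model of coincident-current selection in a core memory plane:
# a core flips only when the currents through its X and Y drive lines
# sum past its switching threshold.

FULL = 1.0        # current needed to flip a core
HALF = FULL / 2   # half-select current on each drive line

class CorePlane:
    def __init__(self, rows: int, cols: int):
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, row: int, col: int, value: int) -> None:
        # Energize one X line and one Y line at half current each.
        for r in range(len(self.bits)):
            for c in range(len(self.bits[0])):
                drive = (HALF if r == row else 0) + (HALF if c == col else 0)
                if drive >= FULL:          # only true at (row, col)
                    self.bits[r][c] = value

plane = CorePlane(4, 4)
plane.write(2, 1, 1)
assert plane.bits[2][1] == 1      # the addressed core flipped...
assert plane.bits[0][1] == 0      # ...half-selected cores did not
```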

source