(Roughly) Daily

Posts Tagged ‘Alexander Hamilton’

“Memory resides not just in brains but in every cell”*…


As the redoubtable Claire L. Evans [and here] reports, a small but enthusiastic group of neuroscientists is exhuming overlooked experiments and performing new ones to explore whether cells record past experiences — fundamentally challenging our understanding of what memory is…

In 1983, the octogenarian geneticist Barbara McClintock stood at the lectern of the Karolinska Institute in Stockholm. She was famously publicity averse — nearly a hermit — but it’s customary for people to speak when they’re awarded a Nobel Prize, so she delivered a halting account of the experiments that had led to her discovery, in the early 1950s, of how DNA sequences can relocate across the genome. Near the end of the speech, blinking through wire-framed glasses, she changed the subject, asking: “What does a cell know of itself?”

McClintock had a reputation for eccentricity. Still, her question seemed more likely to come from a philosopher than a plant geneticist. She went on to describe lab experiments in which she had seen plant cells respond in a “thoughtful manner.” Faced with unexpected stress, they seemed to adjust in ways that were “beyond our present ability to fathom.” What does a cell know of itself? It would be the work of future biologists, she said, to find out.

Forty years later, McClintock’s question hasn’t lost its potency. Some of those future biologists are now hard at work unpacking what “knowing” might mean for a single cell, as they hunt for signs of basic cognitive phenomena — like the ability to remember and learn — in unicellular creatures and nonneural human cells alike. Science has long taken the view that a multicellular nervous system is a prerequisite for such abilities, but new research is revealing that single cells, too, keep a record of their experiences for what appear to be adaptive purposes.

In a provocative study published in Nature Communications late last year, the neuroscientist Nikolay Kukushkin and his mentor Thomas J. Carew at New York University showed that human kidney cells growing in a dish can “remember” patterns of chemical signals when they’re presented at regularly spaced intervals — a memory phenomenon common to all animals, but unseen outside the nervous system until now. Kukushkin is part of a small but enthusiastic cohort of researchers studying “aneural,” or brainless, forms of memory. What does a cell know of itself? So far, their research suggests that the answer to McClintock’s question might be: much more than you think…

[Evans explains the prevailing wisdom, outlines the experiments that have challenged it, and unpacks (at least some reasons for) resistance to the notion of cellular-scale memory, both sociological and semantic…]

… In neuroscience, [biochemist and neuroscientist Nikolay] Kukushkin writes, the most common definition of memory is that it’s what remains after experience to change future behavior. This is a behavioral definition; the only way to measure it is to observe that future behavior. Think of S. roeselii snapping back into its holdfast, or a lab rat freezing up at the sight of an electrified maze it’s tangled with before. In these cases, how an organism reacts is a clue that prior experience left a lingering trace.

But is a memory only a memory when it’s associated with an external behavior? “It seems like an arbitrary thing to decide,” Kukushkin said. “I understand why it was historically decided to be that, because [behavior] is the thing you can measure easily when you’re working with an animal. I think what happened is that behavior started as something that you could measure, and then it ended up being the definition of memory.”

Behavior tells us that a memory has formed but says nothing about why, how or where. Further, it’s constrained by scale. Take Aplysia californica, a muscular sea slug with enormous neurons (the largest is about the size of a letter on a U.S. penny). Neuroscientists love to conduct memory experiments on Aplysia because its physical responses are easy to measure — poke it and it flinches — and they map cleanly to the handful of sensory and motor neurons involved.

The sea slug, Kukushkin said, can complicate neuroscience’s behavioral bias. Say you shock its tail, triggering a defensive reflex. If you shock it again the next day and find that the defensive reflex is stronger than it was before, that’s behavioral evidence that the slug remembers its initial shock. Any neuroscientist would associate it with a memory.

But what if (apologies to the squeamish) you take that sea slug apart and leave only its immobile neurons? Unlike the intact creature, the neurons can’t retract, so there will be no visible response. Is the memory gone? Certainly not, but without external validation, a behavioral definition of memory breaks down. “We no longer call that a memory,” Kukushkin said. “We call that a mechanism for a memory, we call that synaptic change underlying memory, we call that an analogue of memory. But we don’t call that a memory, and I think that it’s arbitrary.”

Perhaps a definition of memory should extend beyond behavior to encompass more records of the past. A vaccination is a kind of memory. So is a scar, a child, a book. “If you make a footprint, it’s a memory,” Gershman said. An interpretation of memory as a physical event — as a mark made on the world, or on the self — would encompass the biochemical changes that occur within a cell. “Biological systems have evolved to harness those physical processes that retain information and use them for their own purposes,” [cognitive scientist Sam] Gershman said.

So, what does a cell know of itself? Perhaps a better version of Barbara McClintock’s question is: What can a cell remember? When it comes to survival, what a cell knows of itself isn’t as important as what it knows of the world: how it incorporates information about its experiences to determine when to bend, when to battle and when to make a break for it.

A cell preserves the information that preserves its existence. And in a sense, so do we. As today’s cellular memory researchers revisit abandoned experimental threads from the past, they too are discovering what memory owes to its context, how science’s sociological environment can determine which ideas are preserved and which are forgotten. It’s almost as though a field is waking up from a 50-year spell of amnesia. Fortunately, the memories are flooding back…

“What Can a Cell Remember?” from @theuniverse.bsky.social in @quantamagazine.bsky.social.

For more on the work that got Barbara McClintock onto the Nobel podium see here.

And, also apposite, a pair of cautionary historical examples of scientists who followed Jean-Baptiste Lamarck and went astray: Lysenko and Kammerer. Lamarck argued in the early 19th century that an organism can pass on to its offspring physical characteristics that it acquired through use or disuse during its lifetime– that is to say, that learning (a kind of memory) is heritable.

* James Gleick, The Information

###

As we muse on memory (and note that one cannot remember– and learn from– what one cannot know), we might recall that it was on this date in 1735 that New York Weekly Journal publisher and writer John Peter Zenger was acquitted of seditious libel against the royal governor of New York, William Cosby, on the basis that what he had published was true.

In 1733, Zenger had begun printing The New York Weekly Journal, voicing opinions critical of the colonial governor.  On November 17, 1734, on Cosby’s orders, the sheriff arrested Zenger. After a grand jury refused to indict him, the Attorney General Richard Bradley charged him with libel. Zenger’s lawyers, Andrew Hamilton and William Smith, Sr., successfully argued that truth is a defense against charges of libel… and Zenger became a symbol for freedom of the press.

Andrew Hamilton defending John Peter Zenger in court, 1734–1735 (source)

“Language brings with it an identity and a culture, or at least the perception of it.”*…

Liz Tracey on Noah Webster’s American Dictionary of the English Language— and the way that it declared Americans free from the tyranny of British institutions and their vocabularies…

Sometimes, a dictionary is more than just words and definitions—it may be intended to serve as a declaration of linguistic independence. When Noah Webster’s first edition of the American Dictionary of the English Language was published in April 1828, it held 70,000 words, 12,000 of which were making their first appearance in dictionary form. Webster’s goals for the work were grand: “to furnish a standard of our vernacular tongue, which we shall not be ashamed to bequeath to three hundred millions of people, who are destined to occupy, and I hope, to adorn the vast territory within our jurisdiction.”

Noah Webster’s roles in the formation of the early United States were manifold: editor of the Federalist Papers, owner and editor of the first American daily newspaper [see below], textbook author, a founder of Amherst College, promoter of the first US copyright laws, and author of one of the first works on epidemiology, used by nineteenth-century medical schools.

But his 1828 dictionary is what he’s remembered for, coming at a tremendous personal cost: twenty-one years invested, and a lifelong struggle with debt. In his preface to the three-volume work, he writes of his hopes that the dictionary will result in his fellow Americans’ “improvement and their happiness; and for the continued increase of the wealth, the learning, the moral and religious elevation of character, and the glory of my country.”…

More at: “Webster’s Dictionary 1828: Annotated,” from @liztracey in @JSTOR_Daily.

* Trevor Noah, Born a Crime

###

As we vindicate vernacular, we might recall that it was on this date in 1846 that the first edition of the Cambridge Chronicle was published. One of the earliest weeklies in the U.S., it served the newly-incorporated city of Cambridge, MA– using language consistent with Webster’s dictionary. (Nearby Boston was home to the first U.S. newspaper, Publick Occurrences Both Forreign and Domestick, founded in 1690, albeit short-lived.)

The Cambridge Chronicle is technically the longest continuously-published weekly newspaper in the U.S… though it ceased serving up original content in 2022, after being purchased by Gannett. It now re-publishes regional stories from other Gannett papers.

As for Webster, he began his journalistic career in 1779, writing articles for New England newspapers justifying the Revolutionary War. In 1793, Alexander Hamilton recruited him to edit the leading Federalist Party newspaper; then in December of that year, Webster founded New York’s (and the new American nation’s) first daily newspaper, American Minerva, later renamed the Commercial Advertiser, which he edited for four years (writing the equivalent of 20 volumes of articles and editorials).

Frontpage of the first edition, May 7, 1846 (source)

“Humanity’s 21st century challenge is to meet the needs of all within the means of the planet”*…

One evening in December, after a long day working from home, Jennifer Drouin, 30, headed out to buy groceries in central Amsterdam. Once inside, she noticed new price tags. The label by the zucchini said they cost a little more than normal: 6¢ extra per kilo for their carbon footprint, 5¢ for the toll the farming takes on the land, and 4¢ to fairly pay workers. “There are all these extra costs to our daily life that normally no one would pay for, or even be aware of,” she says.

The so-called true-price initiative, operating in the store since late 2020, is one of dozens of schemes that Amsterdammers have introduced in recent months as they reassess the impact of the existing economic system. By some accounts, that system, capitalism, has its origins just a mile from the grocery store. In 1602, in a house on a narrow alley, a merchant began selling shares in the nascent Dutch East India Company. In doing so, he paved the way for the creation of the first stock exchange—and the capitalist global economy that has transformed life on earth. “Now I think we’re one of the first cities in a while to start questioning this system,” Drouin says. “Is it actually making us healthy and happy? What do we want? Is it really just economic growth?”

In April 2020, during the first wave of COVID-19, Amsterdam’s city government announced it would recover from the crisis, and avoid future ones, by embracing the theory of “doughnut economics.” Laid out by British economist Kate Raworth in a 2017 book, the theory argues that 20th century economic thinking is not equipped to deal with the 21st century reality of a planet teetering on the edge of climate breakdown. Instead of equating a growing GDP with a successful society, our goal should be to fit all of human life into what Raworth calls the “sweet spot” between the “social foundation,” where everyone has what they need to live a good life, and the “environmental ceiling.” By and large, people in rich countries are living above the environmental ceiling. Those in poorer countries often fall below the social foundation. The space in between: that’s the doughnut.
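The “sweet spot” logic can be made concrete in a few lines: an indicator sits inside the doughnut when it is at or above its social foundation and at or below its environmental ceiling. The indicators and thresholds below are made up for illustration– they are not Raworth’s actual metrics.

```python
def in_doughnut(indicators, foundation, ceiling):
    """Return True if every indicator lies between its social
    foundation (minimum) and its environmental ceiling (maximum)."""
    return all(
        foundation[name] <= value <= ceiling[name]
        for name, value in indicators.items()
    )

# Hypothetical thresholds, purely for illustration.
foundation = {"housing_access": 0.95, "co2_tonnes_per_capita": 0.0}
ceiling = {"housing_access": 1.00, "co2_tonnes_per_capita": 2.0}

# A city meeting both its social floor and environmental ceiling...
print(in_doughnut({"housing_access": 0.97, "co2_tonnes_per_capita": 1.5},
                  foundation, ceiling))
# ...versus one overshooting its environmental ceiling.
print(in_doughnut({"housing_access": 0.97, "co2_tonnes_per_capita": 8.0},
                  foundation, ceiling))
```

The point of the sketch is only the shape of the test: a society can fail the doughnut from below (social foundation) or from above (environmental ceiling), and both checks run per indicator.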

Amsterdam’s ambition is to bring all 872,000 residents inside the doughnut, ensuring everyone has access to a good quality of life, but without putting more pressure on the planet than is sustainable. Guided by Raworth’s organization, the Doughnut Economics Action Lab (DEAL), the city is introducing massive infrastructure projects, employment schemes and new policies for government contracts to that end. Meanwhile, some 400 local people and organizations have set up a network called the Amsterdam Doughnut Coalition– managed by Drouin– to run their own programs at a grassroots level…

You’ve heard about “doughnut economics,” a framework for sustainable development; now one city, spurred by the pandemic, is putting it to the test: “Amsterdam Is Embracing a Radical New Economic Theory to Help Save the Environment. Could It Also Replace Capitalism?”

Kate Raworth, originator of the Doughnut Economics framework

###

As we envisage equipoise, we might recall that it was on this date in 1791 that President George Washington signed the Congressional legislation creating “The President, Directors and Company, of the Bank of the United States,” commonly known as the First Bank of the United States. While it effectively replaced the Bank of North America, the nation’s first de facto central bank, the First Bank of the United States was the nation’s first official central bank.

The Bank was the cornerstone of a three-part expansion of federal fiscal and monetary power (along with a federal mint and excise taxes) championed by Alexander Hamilton, first Secretary of the Treasury– and strongly opposed by Thomas Jefferson and James Madison, who believed that the bank was unconstitutional, and that it would benefit merchants and investors at the expense of the majority of the population. Hamilton argued that a national bank was necessary to stabilize and improve the nation’s credit, and to improve handling of the financial business of the United States government under the newly enacted Constitution.

History might suggest that both sides were correct.

source

“We shape our tools and thereafter our tools shape us”*…

 


 

By the late 1970s, workers on Wall Street were already using rudimentary email processes, putting them among the first to adopt personal computers outside of the sciences, academia, and home hobbyists, according to technologist David Wolfe. But finance’s love affair with computers really took off in the early ‘80s when spreadsheets arrived, and firms began providing in-house employee training for this tool—one that, even today, surprisingly few of us feel comfortable with.

At the time, those groundbreaking programs included VisiCalc—the first-ever digital spreadsheet, and “the ‘killer app’ for the Apple II,” [technologist David] Wolfe said—along with Lotus 1-2-3, which offered expanded capabilities in some areas, and similarly boosted IBM’s PCs.

According to Wolfe, co-director of the Innovation Policy Lab at the University of Toronto’s Munk School of Global Affairs and Public Policy, “The spreadsheet immediately started getting picked up by the financial services industry for its ability to do ‘what if’ calculations, like: If the rate changes from 1% to 2%, how will it affect my investment capital?”

Almost immediately, Wall Street also started using the technology to create new, more complex kinds of trading and investments. “It became an incredible time-saver tool, but also started to play into the creation of derivatives,” Wolfe explained…
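The “what if” recalculation Wolfe describes is just a formula re-evaluated when one input cell changes. A minimal sketch, using a hypothetical compound-interest example (not anything from the article):

```python
def future_value(principal, annual_rate, years):
    """Compound interest: value of `principal` after `years`
    at `annual_rate`, compounded annually."""
    return principal * (1 + annual_rate) ** years

# "If the rate changes from 1% to 2%, how will it affect my
# investment capital?" -- the same formula, two input values,
# just as a spreadsheet recomputes when the rate cell is edited.
capital = 100_000.0
for rate in (0.01, 0.02):
    print(f"rate {rate:.0%}: {future_value(capital, rate, 10):,.2f}")
```

One percentage point of rate roughly doubles the ten-year gain– the kind of sensitivity that is tedious to see by hand but instant in a spreadsheet.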

Let it Visi-snow: “How the Invention of Spreadsheet Software Unleashed Wall Street on the World.”

* Father John Culkin, SJ (though often attributed to his friend Marshall McLuhan)

###

As we copy and paste, we might send expansionary birthday greetings to Jean-Baptiste Colbert; he was born on this date in 1619.  Minister of Finances of France from 1661 to 1683 under the rule of King Louis XIV, Colbert pursued dirigiste policies (those of a strong, directive state, e.g., tariffs, proactive industrial policy) to create a favorable balance of trade and to increase France’s colonial holdings and foreign market access.  His policies inspired those of Alexander Hamilton, the first treasury secretary of the United States and foundational architect of the U.S. national economy.

source

 

 

Written by (Roughly) Daily

August 29, 2019 at 1:01 am

“Economic theory is the art of pulling a rabbit out of a hat right after you’ve stuffed it into the hat in full view of the audience”*…

 


Many critics were disappointed the 2008 crisis did not lead to an intellectual revolution on the scale of the 1930s. But the image of stasis you’d get from looking at the top journals and textbooks isn’t the whole picture — the most interesting conversations are happening somewhere else. For a generation, leftists in economics have struggled to change the profession, some by launching attacks (often well aimed, but ignored) from the outside, others by trying to make radical ideas parseable in the orthodox language. One lesson of the past decade is that both groups got it backward. Keynes famously wrote that “Practical men who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.” But in recent years the relationship seems to have been more the other way round. If we want to change the economics profession, we need to start changing the world. Economics will follow.

From J.W. Mason‘s helpful survey of economic thought since the Crash of 2008: “How a Decade of Crisis Changed Economics.”

* Joan Robinson

###

As we contemplate counting, we might send revolutionary birthday greetings to Alexander Hamilton; he was born on this date in 1755 (or 1757, there is some scholarly debate about the year, but not the date).  A Founding Father of the United States, Hamilton created the Federalist Party (proponent of stronger national government than provided by the Articles of Confederation), the United States Coast Guard, and the New York Post newspaper.  But he was probably most notably the creator of the new nation’s financial system.  The main author of the economic policies of George Washington’s administration, he took the lead in the Federal government’s funding of states’ debts, and established a national bank, a system of tariffs, and friendly trade relations with Britain.  His vision included a strong central government led by a vigorous executive branch, a strong commercial economy, a national bank supporting manufacturing, and a strong military…. in all of which he stood most frequently opposed to Thomas Jefferson, who favored agrarian and small government policies.

source

 

Written by (Roughly) Daily

January 11, 2019 at 1:01 am