(Roughly) Daily

Posts Tagged ‘adding machine’

“Wealth does not consist in money or in gold and silver, but in what money purchases”*…

For millennia, simple forms of record-keeping have been used as ways to keep track of debt, to substitute for the contemporaneous conveyance of specie, or to accommodate the future settlement and netting of debts. In England, tally sticks were regularly used. An excerpt from Paolo Zannoni’s book Money and Promises, via Richard Vague and his invaluable Delancey Place…

A tally is usually a stick, or a bone, or a piece of ivory — some kind of artefact — that is used to record information. Palaeolithic tallies include the Lebombo bone, found in the Lebombo Mountains in southern Africa, reported to date from around 44,000 BC; the Ishango bone, which consists of the fibula of a baboon, from the Democratic Republic of the Congo (the former Belgian Congo), thought to be 20,000 years old; and the so-called Wolf bone, discovered in Czechoslovakia during excavations at Vestonice, Moravia, in the 1930s, and estimated to be around 30,000 years old. Marked with notches and symbols, these tallies are ancient recording devices, means of data storage and communication. Not merely artefacts, they are important historical documents.

In England, from around the twelfth century, and for over 600 years, tallies became important financial instruments, a key part of public finance and an answer to a perennial problem for money-lenders, merchants and those involved in commerce and trade: how to both facilitate and record the exchange of goods, services and commodities. Reading these English tallies, understanding their history and their changing use, provides us with an understanding not only of the nature of individual financial transactions during the late medieval and early modern period, but also of the development of banking practices in England and their relationship to the English state.

Usually made of willow or hazelwood, tallies were used to record the key information of a financial exchange. The names of the parties involved, the specific trade and the date were written on each side of a stick. Notches of different sizes — which stood for pounds, shillings, and pence — were also cut on both sides. Then the stick was split in two along its length, creating a unique jagged edge; only those two pieces could ever fit perfectly together again. When someone presented one side as proof of a transaction, the parties could check for the right fit.
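The mechanism is, in effect, a tamper-evident duplicate record: both halves carry the same inscription and notches, and only the matching pair fits. A minimal sketch in Python, with illustrative (not historical) names, and amounts notched in pre-decimal pounds, shillings and pence:

```python
from dataclasses import dataclass

# Pre-decimal English money: 1 pound = 20 shillings, 1 shilling = 12 pence.
def to_pence(pounds, shillings, pence=0):
    return (pounds * 20 + shillings) * 12 + pence

@dataclass(frozen=True)
class TallyHalf:
    split_id: str      # stands in for the unique jagged edge of the split
    inscription: str   # parties, trade and date, written on the wood
    notches: int       # the amount, encoded here as total pence

def split_tally(split_id, inscription, pounds, shillings, pence=0):
    """Cut one tally, then split it: both halves carry identical markings."""
    amount = to_pence(pounds, shillings, pence)
    stock = TallyHalf(split_id, inscription, amount)  # longer half, for the payer
    foil = TallyHalf(split_id, inscription, amount)   # shorter half, kept by the creditor
    return stock, foil

def fits(a, b):
    """The check made when one side is presented as proof: do the halves match?"""
    return (a.split_id == b.split_id
            and a.inscription == b.inscription
            and a.notches == b.notches)
```

Here `split_id` stands in for the physical jagged edge: in wood, no forgery could reproduce it, whereas in software the uniqueness of the identifier would have to be guaranteed some other way. The D’Abernon payment below, two pounds and ten shillings, would be notched as 600 pence.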

The potential uses for such a simple tool are obvious.

To begin with: an example of the early use of tallies as a record of debt repayment. John D’Abernon was the Sheriff of Surrey. His portrait in brass, in Stoke D’Abernon Church, Cobham, shows him as a knight in full armour, wielding a broadsword.

When he died, D’Abernon left his title, possessions and debts to his son, also named John. In 1293, we know that John D’Abernon gave two pounds and ten shillings to the Exchequer to pay a fine on behalf of his father. How do we know? Because at the time of payment, the official tally cutter made a series of notches on a stick: two cuts for the two pounds and one smaller notch for the ten shillings. The stick was then split, with the longer end going to John, and the shorter end staying with the Exchequer. The following words were inscribed on both sides: ‘From John D’Abernon for his father’s fine’ and ‘XXI year of the King Edward’.

John could thus prove to anyone that he had paid the fine of his father — simple and convenient.

Tallies also enabled the functioning of the tax system in medieval England, which was a rather more complex affair. The process took months to complete. It worked roughly like this. Tax receivers collected revenues from the King’s subjects at Easter. They then passed them on to the Exchequer, which completed an audit in late September or early October. At the time, the Exchequer had two branches: the Lower and the Higher. The Lower Exchequer received and disbursed the revenues. The Higher Exchequer audited the process. They used tallies to track who had paid whom. As soon as the Lower Exchequer received the revenues, the tally cutter recorded the payment on the tally and split the stick. The tax receiver — the debtor — got the longer part, called the ‘stock’. The Exchequer — the creditor — kept the short end of the stick, called the ‘foil’. And once a year, at Michaelmas, the Higher Exchequer audited the whole process by matching stocks and foils. The stock was the proof that the collector had not merely pocketed the tax revenues.
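The Michaelmas reconciliation can be pictured as a simple matching of stocks presented by the tax receivers against foils held by the Exchequer. A hedged sketch (the tuple encoding, `(split_id, pence)`, is mine, not the Exchequer’s):

```python
# Each half is a (split_id, pence) pair. Foils are held by the Exchequer;
# stocks are presented by the tax receivers at the Michaelmas audit.
def audit(stocks, foils):
    """Match stocks against foils; return agreements, disagreements,
    and foils for which no stock was presented at all."""
    foil_book = dict(foils)
    matched, discrepancies = [], []
    for split_id, pence in stocks:
        if foil_book.get(split_id) == pence:
            matched.append(split_id)
        else:
            discrepancies.append(split_id)
    presented = {split_id for split_id, _ in stocks}
    # Foils with no matching stock: revenue received but unaccounted for.
    unaccounted = [split_id for split_id, _ in foils if split_id not in presented]
    return matched, discrepancies, unaccounted
```

A stock that disagrees with its foil, or a foil with no stock presented, is exactly the signal the Higher Exchequer was looking for: money received into the system but not properly accounted for by the collector.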

Over time, both the use and appearance of the tallies began to change: in the early years, tallies were 3 to 5 inches long; later, they grew to be 1 to 2 feet long, and sometimes much longer. More money meant more notches; more notches, in turn, required longer sticks. One of the last issues of tallies made by the English Exchequer was in 1729, for £50,000: the tally is a whopping 8 feet, 5 inches long, visible proof of the growth of public spending, taxation and inflation.

As the appearance of the tallies changed, so too did their uses. Inside the Exchequer, they served as receipts for money paid by taxpayers. Outside the Exchequer, they began to be put to entirely different purposes.

The business of the Exchequer simply could not work without the tally sticks. They were essential for auditing and controlling public finances, which obviously made them excellent collateral for a loan.

The tally was not a mere generic promise to pay, but a strong, unique claim on the proceeds of the Exchequer’s revenue stream. It identified the cashflow and the individual in charge of paying; the creditor gave the stock to the indicated tax receiver to get coins from a specific revenue stream, and a lender was sure to get his coins sooner or later. The humble English tally stick was therefore ripe to become a veritable public debt security, not merely a receipt. Tallies functioned just like paper public debt securities, except that instead of being written on paper, the transactions were instantiated and inscribed on sticks.

To take an early example: Richard de la Pole was a merchant who traded wool, wine and corn with France and central Europe in the early 1300s. He had a reputation for using debts aggressively to grow his business, which appealed to King Edward III and his advisors, who thought they might be able to make use of his skills. So, they appointed him Royal Butler. The job of butler was to supply all sorts of goods — food, wine and arms — to the royal household and to the army. We know that in 1328 Richard bought some wine from the French. As a good businessman, as Royal Butler, did he pay for the wine in coins? He did not. Rather, in order to pay the bill, the Lower Exchequer cut eight tallies, which were addressed to the collectors of taxes for West Riding in Yorkshire, listing the tax revenues earmarked to settle the debt. The Lower Exchequer gave the foils — one half of all the eight tallies — to Richard, who handed them to the merchants who sold him the wine. The merchants then exchanged the tallies for coins from the taxes paid in West Riding, and finally, a few months later, the Higher Exchequer called upon the tax receivers to account for the shortfall of cash, whereupon they presented the eight foils, which had been first given to Richard, as proof of the payments made.

To be clear: unlike coins, tallies did not actually settle debt. By accepting a foil, a vendor was effectively agreeing to a delayed payment from the Exchequer; the tally was a kind of guarantee that they would get coins. For the state, meanwhile, the tally was a convenient way to borrow from its suppliers, or a form of what we would now call vendor financing — the citizens and merchants who sold goods and services for tallies were effectively financing the state, in much the same way as those who lent actual coins to the Exchequer…

How record-keeping became finance: “Tally Sticks for Money,” via @delanceyplace.

Having looked back, we’d do well to heed Jack Weatherford’s admonition (in his 1997 book The History of Money):

As money grows in importance, a new struggle is beginning for the control of it in the coming century. We are likely to see a prolonged era of competition during which many kinds of money will appear, proliferate, and disappear in rapidly crashing waves. In the quest to control the new money, many contenders are struggling to become the primary money institution of the new era…

* Adam Smith

###

As we contemplate currency, we might recall that it was on this date in 1888 that William Seward Burroughs of St. Louis, Missouri, received patents on four adding machine applications (No. 388,116-388,119), the first U.S. patents for a “Calculating-Machine” that the inventor would continue to improve and successfully market– largely to businesses and financial institutions.  The American Arithmometer Corporation of St. Louis, later renamed The Burroughs Corporation, became– with IBM, Sperry, NCR, Honeywell, and others– a major force in the development of computers.  Burroughs also gifted the world his grandson, Beat icon William S. Burroughs.

 source

Written by (Roughly) Daily

August 21, 2024 at 1:00 am

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…

It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…

2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.

Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:

• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.

• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.

• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.

• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.

• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.

And skipping ahead to just the past two weeks:

• nuclear fusion ignition with net energy gain was achieved for the second time

• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans

• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).

Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”

The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…

Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:

  1. People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
  2. An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.

[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle, Newton, at what they hoped/expected, and at how that differed for what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]

As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.

In the meantime, I’m pretty psyched about the cancer drugs…

As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”

On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.

* Max Planck

###

As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he invented the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never saw.

Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.

source

“Prediction is very difficult, especially if it’s about the future”*…

 

Robot-assisted farming

It’s easy to chuckle at the prognostications of yore– where’s my jet pack?!?  But as long-time readers will recall, there was one writer whose predictions were uncannily on the money:  Jules Verne.

His Paris in the 20th Century, for example, describes air conditioning, automobiles, the Internet, television, even electricity, and other modern conveniences very similar to their real-world counterparts, developed years– in many cases, decades– later.  From the Earth to the Moon, apart from using a space gun instead of a rocket, is uncannily similar to the real Apollo Program: three astronauts are launched from the Florida peninsula– from “Tampa Town” (only 130 miles from NASA’s Cape Canaveral)– and recovered through a splash landing.  And in other works, he predicted helicopters, submarines, projectors, jukeboxes, and the existence of underwater hydrothermal vents that were not invented/discovered until long after he wrote about them.

Verne’s writings caught the imagination of his countrymen.  As Singularity Hub reports,

Starting in 1899, a commercial artist named Jean-Marc Côté and other artists were hired by a toy or cigarette manufacturer to create a series of picture cards as inserts, according to Matt Novak who writes for the Smithsonian magazine. The images were to depict how life in France would look in a century’s time, no doubt heavily influenced by Verne’s writings. Sadly, they were never actually distributed. However, the only known set of cards to exist was discovered by Isaac Asimov, who wrote a book in 1986 called “Futuredays” in which he presented the illustrations with commentary…

In what some French people might consider an abomination, one illustration depicted the modern kitchen as a place of food science. While synthetic food in commercial products is sadly more common today than we’d like to admit (sorry Easy Cheese lovers, but I’m calling you out), the rise of molecular gastronomy in fine dining has made food chemistry a modern reality. It may seem like food science has its limitations, but one only needs to consider efforts to grow meat in a laboratory to see how far technology may go…

“Food Science”

See them all at “19th Century Artists Predicted the Future in This Series of Postcards.”

[A re-post, inspired by this piece in Upworthy.]

* Niels Bohr

###

As we console ourselves that, while the future may be another country, we may still speak the language, we might recall that it was on this date in 1888 that William Seward Burroughs of St. Louis, Missouri, received patents on four adding machine applications (No. 388,116-388,119), the first U.S. patents for a “Calculating-Machine” that the inventor would continue to improve and successfully market.  The American Arithmometer Corporation of St. Louis, later renamed The Burroughs Corporation, became– with IBM, Sperry, NCR, Honeywell, and others– a major force in the development of computers.  Burroughs also gifted the world his grandson, Beat icon William S. Burroughs.

 source

 

Written by (Roughly) Daily

August 21, 2016 at 1:01 am