“Government is an art, not a science, and an adventure, not a planned itinerary”*…
And sometimes, suggests Brian Potter, that adventure is more adventurous than others…
I spend a lot of time reading about manufacturing and its evolution, which means I end up repeatedly reading about the times and places where radical changes in manufacturing were taking place: Britain in the late 18th century, the US in the late 19th and early 20th centuries, Japan in the second half of the 20th century, and (to a lesser extent) China today. I’ve been struck by how many parallels there are between modern China (roughly the period from the late 1970s till today) and the Gilded Age/Progressive era U.S. (roughly the period from the late 1860s to the 1920s).
During these periods, unprecedented levels of economic growth combined with large populations were making both the U.S. and China wealthy and powerful. Both countries were urbanizing, building enormous amounts of infrastructure, and becoming by far the largest manufacturers in the world, with industrial operations of unprecedented size. Both were undergoing wrenching social and cultural change as old institutions were replaced by new ones, and the countries began to become “modern.” Both were nations of ambitious strivers, where it seemed like anyone with talent could make themselves into a success by catching the tide of rising opportunity. Despite the many differences between the two countries, the forces of development pulled them along very similar paths…
[Potter reviews the histories of development in the U.S. and in China…]
… Yuen Yuen Ang [here] likewise notes the similarities between modern China and the Gilded Age U.S., stating that “both countries underwent a wrenching structural conversion from rural to urban and closed to global markets, producing once-in-a-generation opportunities for the politically connected and enterprising…to acquire fabulous wealth.”
The most interesting thing about these parallels, to me, is that the U.S. and China in many ways were starting from very different places. Prior to its opening up, China’s economy was entirely state-owned and state-planned, and its economic expansion was coupled with unwinding much of the state enterprise machinery, letting small businesses form and markets bloom.
The U.S., on the other hand, was on the other end of the spectrum. Prior to its economic expansion it had an incredibly weak state and an economy driven by very small enterprises. Its development was accompanied by the creation of large, powerful companies and institutions, and a move away from the “invisible hand” of the market and towards the “visible hand” of exchanges of goods and services mediated within very large organizations.
China’s success came from finding ways to mobilize its huge number of people and hasn’t necessarily been focused on operating at the frontier of efficiency. The U.S., on the other hand, despite its comparatively large population, had a chronic shortage of labor, and much of its development was focused on developing less labor-intensive manufacturing technologies like the American System. China built its success on the back of inexpensive labor, and it remains a middle-income country. In the U.S., labor has never been cheap; the U.S. had nearly the GDP per capita of Britain as early as the 1820s, and it had the highest GDP per capita in the world by the 1880s. But despite these differences, the logic of development pulled the U.S. and China along very similar paths. Both countries could exploit very large markets (both at home and abroad) and operated their industries at very large scales in order to do so. In both countries, this required a novel set of institutions that was radically different from what came before, and the transformation that created those institutions spawned cultures with many similarities…
“How China Is Like the 19th Century U.S.,” from @_brianpotter (via @ByrneHobart).
One notes that any solution brings its own crop of new problems… another way in which China’s recent history recalls the Gilded Age– and its aftermath.
See also: “The 2024 Nobel Laureates Are Not Only Wrong About China, But Also About the West” from Yuen Yuen Ang, cited above.
Apposite: “The Surprising Resilience of Globalization: An Examination of Claims of Economic Fragmentation” by Brad Setser.
###
As we ponder parallels (lest we wonder if progress accrues during these developmental periods), we might recall that it was on this date in 1904 that Harvey Hubbell received a patent for an invention that changed life in the U.S. and beyond.
In 1888, at the age of 31, Hubbell had quit his job as a manager of a manufacturing company and founded Hubbell Incorporated in Bridgeport, Connecticut, a company which is still in business today, still headquartered near Bridgeport. Hubbell began manufacturing consumer products and, by necessity, inventing manufacturing equipment for his factory. Some of the equipment he designed included automatic tapping machines and progressive dies for blanking and stamping. One of his most important industrial inventions, still in use today, is the thread rolling machine. He quickly began selling his newly devised manufacturing equipment alongside his commercial products.
Hubbell received at least 45 patents, most of which were for electric products. For example, he patented the pull-chain electrical light socket in 1896. But his most famous– and impactful– patent was the one he received on this date: the U.S. electrical power plug, which allowed the adoption in the U.S. of convenient, portable electrical devices (which Great Britain had enjoyed since the early 1880s). In 1916, Hubbell was also granted a patent for a three-bladed power plug, including a ground prong.
“Wealth does not consist in money or in gold and silver, but in what money purchases”*…
For millennia, simple forms of record-keeping have been used as ways to keep track of debt, to substitute for the contemporaneous conveyance of specie, or to accommodate the future settlement and netting of debts. In England, tally sticks were regularly used. From Paolo Zannoni, an excerpt from his book, Money and Promises, via Richard Vague and his invaluable Delancey Place…
A tally is usually a stick, or a bone, or a piece of ivory — some kind of artefact — that is used to record information. Palaeolithic tallies include the Lebombo bone, found in the Lebombo Mountains in southern Africa, reported to date from around 44,000 BC; the Ishango bone, which consists of the fibula of a baboon, from the Democratic Republic of the Congo (the former Belgian Congo), thought to be 20,000 years old; and the so-called Wolf bone, discovered in Czechoslovakia during excavations at Vestonice, Moravia, in the 1930s, and estimated to be around 30,000 years old. Marked with notches and symbols, these tallies are ancient recording devices, means of data storage and communication. Not merely artefacts, they are important historical documents.
In England, from around the twelfth century, and for over 600 years, tallies became important financial instruments, a key part of public finance and an answer to a perennial problem for money-lenders, merchants and those involved in commerce and trade: how to both facilitate and record the exchange of goods, services and commodities. Reading these English tallies, understanding their history and their changing use, provides us with an understanding not only of the nature of individual financial transactions during the late medieval and early modern period, but also of the development of banking practices in England and its relationship to the English state.
Usually made of willow or hazelwood, tallies were used to record the key information of a financial exchange. The names of the parties involved, the specific trade and the date were written on each side of a stick. Notches of different sizes — which stood for pounds, shillings, and pence — were also cut on both sides. Then the stick was split in two along its length, creating a unique jagged edge; only those two pieces could ever fit perfectly together again. When someone presented one side as proof of a transaction, the parties could check for the right fit.
The potential uses for such a simple tool are obvious.
To begin with: an example of the early use of tallies as a record of debt repayment. John D’Abernon was the Sheriff of Surrey. His portrait in brass, in Stoke D’Abernon Church, Cobham, shows him as a knight in full armour, wielding a broadsword.
When he died, D’Abernon left his title, possessions and debts to his son, also named John. In 1293, we know that John D’Abernon gave two pounds and ten shillings to the Exchequer to pay a fine on behalf of his father. How do we know? Because at the time of payment, the official tally cutter made a series of notches on a stick: two cuts for the two pounds and one smaller notch for the ten shillings. The stick was then split, with the longer end going to John, and the shorter end staying with the Exchequer. The following words were inscribed on both sides: ‘From John D’Abernon for his father’s fine’ and ‘XXI year of the King Edward’.
John could thus prove to anyone that he had paid the fine of his father — simple and convenient.
Tallies also enabled the functioning of the tax system in medieval England, which was a rather more complex affair. The process took months to complete. It worked roughly like this. Tax receivers collected revenues from the King’s subjects at Easter. They then passed them on to the Exchequer, which completed an audit in late September or early October. At the time, the Exchequer had two branches: the Lower and the Higher. The Lower Exchequer received and disbursed the revenues. The Higher Exchequer audited the process. They used tallies to track who had paid whom. As soon as the Lower Exchequer received the revenues, the tally cutter recorded the payment on the tally and split the stick. The tax receiver — the debtor — got the longer part, called the ‘stock’. The Exchequer — the creditor — kept the short end of the stick, called the ‘foil’. And once a year, at Michaelmas, the Higher Exchequer audited the whole process by matching stocks and foils. The stock was the proof that the collector had not merely pocketed the tax revenues.
Over time, both the use and appearance of the tallies began to change: in the early years, tallies were 3 to 5 inches long; later, they grew to be 1 to 2 feet long, and sometimes much longer. More money meant more notches; more notches, in turn, required longer sticks. One of the last issues of tallies made by the English Exchequer was in 1729, for £50,000: the tally is a whopping 8 feet, 5 inches long, visible proof of the growth of public spending, taxation and inflation.
As the appearance of the tallies changed, so too did their uses. Inside the Exchequer, they served as receipts for money paid by taxpayers. Outside the Exchequer, they began to be put to entirely different purposes.
The business of the Exchequer simply could not work without the tally sticks. They were essential for auditing and controlling public finances, which obviously made them excellent collateral for a loan.
The tally was not a mere generic promise to pay, but a strong, unique claim on the proceeds of the Exchequer’s revenue stream. It identified the cashflow and the individual in charge of paying; the creditor gave the stock to the indicated tax receiver to get coins from a specific revenue stream, and a lender was sure to get his coins sooner or later. The humble English tally stick was therefore ripe to become a veritable public debt security, not merely a receipt. Tallies functioned just like paper public debt securities, except instead of being written on paper, the transactions were instantiated and inscribed on sticks.
To take an early example: Richard de la Pole was a merchant who traded wool, wine and corn with France and central Europe in the early 1300s. He had a reputation for using debts aggressively to grow his business, which appealed to King Edward III and his advisors, who thought they might be able to make use of his skills. So, they appointed him Royal Butler. The job of butler was to supply all sorts of goods — food, wine and arms — to the royal household and to the army. We know that in 1328 Richard bought some wine from the French. As a good businessman and as Royal Butler, did he pay for the wine in coins? He did not. Rather, in order to pay the bill, the Lower Exchequer cut eight tallies, which were addressed to the collectors of taxes for the West Riding in Yorkshire, listing the tax revenues earmarked to settle the debt. The Lower Exchequer gave the foils — one half of all the eight tallies — to Richard, who handed them to the merchants who sold him the wine. The merchants then exchanged the tallies for coins from the taxes paid in the West Riding, and finally, a few months later, the Higher Exchequer called upon the tax receivers to account for the shortfall of cash, whereupon they presented the eight foils, which had been first given to Richard, as proof of the payments made.
To be clear: unlike coins, tallies did not actually settle debt. By accepting a foil, a vendor was effectively agreeing to a delayed payment from the Exchequer; the tally was a kind of guarantee that they would get coins. For the state, meanwhile, the tally was a convenient way to borrow from its suppliers, or a form of what we would now call vendor financing — the citizens and merchants who sold goods and services for tallies were effectively financing the state, in much the same way as those who lent actual coins to the Exchequer…
How record-keeping became finance: “Tally Sticks for Money,” via @delanceyplace.
Having looked back, we’d do well to heed Jack Weatherford‘s admonition (in his 1997 book The History of Money):
As money grows in importance, a new struggle is beginning for the control of it in the coming century. We are likely to see a prolonged era of competition during which many kinds of money will appear, proliferate, and disappear in rapidly crashing waves. In the quest to control the new money, many contenders are struggling to become the primary money institution of the new era…
* Adam Smith
###
As we contemplate currency, we might recall that it was on this date in 1888 that William Seward Burroughs of St. Louis, Missouri, received patents on four adding machine applications (No. 388,116-388,119), the first U.S. patents for a “Calculating-Machine” that the inventor would continue to improve and successfully market– largely to businesses and financial institutions. The American Arithmometer Corporation of St. Louis, later renamed The Burroughs Corporation, became– with IBM, Sperry, NCR, Honeywell, and others– a major force in the development of computers. Burroughs also gifted the world his grandson, Beat icon William S. Burroughs.

“Some dreams matter. Most don’t. Often it can be hard to know which might be which.”*…
… So it is with trends and the drivers of fundamental change. One of the canniest spotters of emerging dynamics, Matt Klein, wrestles with some raw material…
I’ve always wanted to play with “conference talk tracks” as a data source, and finally found a moment to do a little analysis.
I took all of the 2025 submitted (not yet accepted) talk titles for SXSW’s “2050” & “Culture” tracks, to identify what patterns exist. I did this via Perplexity AI Pro, an advanced prompt, and detailed profile context.
Note: I don’t necessarily think these talk patterns are “cultural trends” (although we can debate this). Rather, I see these submitted talk patterns as a pulse on what an industry is finding provocative, discussion-worthy and worth peacocking as thought leadership. That’s a different, valuable sort of insight.
I was particularly interested in surfacing not just the largest common denominators of talks (ex. AI, psychedelics, video games), but instead more specific, smaller and unexpected themes.
What makes SXSW unique (and this data worth analyzing) is that these talk submissions are crowdsourced or “bottoms-up” via industry leaders vs. (biased) invited, brand-focused, or sponsored talks, which many other conferences prioritize. Therefore, I find these SXSW-submissions quite organic and a valuable “pulse” on executive, senior leader, and public thinkers’ minds.
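(For the programmatically inclined: Klein did this analysis with a Perplexity prompt rather than code, but a similar first pass at theme-finding can be sketched with off-the-shelf tools. The snippet below is only an illustrative sketch, not his method; it assumes a hypothetical titles.txt file with one submitted talk title per line, plus the open-source sentence-transformers and scikit-learn libraries. Naming the resulting clusters — e.g., “Commodification of Authenticity” — would still be a human, or LLM-assisted, judgment call.)

```python
# Rough sketch: surface candidate themes from talk titles by clustering
# sentence embeddings. Assumes a hypothetical titles.txt (one title per line).
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers
from sklearn.cluster import KMeans                      # pip install scikit-learn

with open("titles.txt", encoding="utf-8") as f:
    titles = [line.strip() for line in f if line.strip()]

# Embed each title as a dense vector capturing its rough semantics.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(titles, normalize_embeddings=True)

# Group the titles into 10 clusters -- one candidate "theme" each.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(embeddings)

# For each cluster, print the titles closest to its centroid as exemplars.
for c in range(k):
    idx = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(embeddings[idx] - km.cluster_centers_[c], axis=1)
    exemplars = [titles[i] for i in idx[np.argsort(dists)[:3]]]
    print(f"Theme {c + 1}: " + " | ".join(exemplars))
```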
With that, 10 themes from hundreds of talk submissions…
01. Commodification of Authenticity
Can authenticity exist online, let alone from a business? Or is “authenticity” a paradox? This cluster explores how the pursuit of genuine experiences and identities is being packaged, sold, and (likely) diluted in the process, raising questions around the nature of authenticity in a hyper-commercial world.
Ex:
- “Authenticity: The New Social Currency for Brands”
- “Honor Your Root: Building Authentic Brands that Connect”
- “Authenticity + Impact: Native Voices in Film, Food & Beyond”
The other nine at: “10 Patterns from 400+ SXSW ’25 Talk Submissions,” from @KleinKleinKlein.
* Dean Koontz, Saint Odd
###
As we sift for significance, we might send lasting birthday greetings to Hazel Bishop; she was born on this date in 1906. An aspiring doctor, she was forced to drop out of medical school during the Great Depression, and instead put her undergraduate degree in chemistry to work. As a senior organic chemist with Standard Oil during World War II, she discovered the cause of deposits affecting superchargers of aircraft engines. Later, in 1949, after a long series of home experiments building on that earlier discovery, in a kitchen fitted out as a laboratory, she perfected a “kiss-proof” “No-Smear Lipstick” that stayed on the lips longer than any other product then available, and began its manufacture. Marketed as the lipstick that “stays on you not on him,” it changed the cosmetic industry, spurring a raft of imitators.

“We ceased to be the lunatic fringe. We’re now the lunatic core.”*…
Further, in a fashion, to yesterday’s post on analog computing, an essay from Benjamin Labatut (the author of two remarkable works of “scientific-historical fiction,” When We Cease to Understand the World and The MANIAC), continuing the animating theme of those books…
We will never know how many died during the Butlerian Jihad. Was it millions? Billions? Trillions, perhaps? It was a fantastic rage, a great revolt that spread like wildfire, consuming everything in its path, a chaos that engulfed generations in an orgy of destruction lasting almost a hundred years. A war with a death toll so high that it left a permanent scar on humanity’s soul. But we will never know the names of those who fought and died in it, or the immense suffering and destruction it caused, because the Butlerian Jihad, abominable and devastating as it was, never happened.
The Jihad was an imagined event, conjured up by Frank Herbert as part of the lore that animates his science-fiction saga Dune. It was humanity’s last stand against sentient technology, a crusade to overthrow the god of machine-logic and eradicate the conscious computers and robots that in the future had almost entirely enslaved us. Herbert described it as “a thalamic pause for all humankind,” an era of such violence run amok that it completely transformed the way society developed from then onward. But we know very little of what actually happened during the struggle itself, because in the original Dune series, Herbert gives us only the faintest outlines—hints, murmurs, and whispers, which carry the ghostly weight of prophecy. The Jihad reshaped civilization by outlawing artificial intelligence or any machine that simulated our minds, placing a damper on the worst excesses of technology. However, it was fought so many eons before the events portrayed in the novels that by the time they occur it has faded into legend and crystallized in apocrypha. The hard-won lessons of the catastrophe are preserved in popular wisdom and sayings: “Man may not be replaced.” “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” “We do not trust the unknown which can arise from imaginative technology.” “We must negate the machines-that-think.” The most enduring legacy of the Jihad was a profound change in humankind’s relationship to technology. Because the target of that great hunt, where we stalked and preyed upon the very artifacts we had created to lift ourselves above the seat that nature had intended for us, was not just mechanical intelligence but the machinelike attitude that had taken hold of our species: “Humans had set those machines to usurp our sense of beauty, our necessary selfdom out of which we make living judgments,” Herbert wrote.
Humans must set their own guidelines. This is not something machines can do. Reasoning depends upon programming, not on hardware, and we are the ultimate program!
The Butlerian Jihad removed a crutch—the part of ourselves that we had given over to technology—and forced human minds to develop above and beyond the limits of mechanistic reasoning, so that we would no longer depend on computers to do our thinking for us.
Herbert’s fantasy, his far-flung vision of a devastating war between humanity and the god of machine-logic, seemed quaint when he began writing it in the Sixties. Back then, computers were primitive by modern standards, massive mainframe contraptions that could process only hundreds of thousands of cycles per second (instead of billions, like today), had very little memory, operated via punch cards, and were not connected to one another. And we have easily ignored Herbert’s warnings ever since, but now the Butlerian Jihad has suddenly returned to plague us. The artificial-intelligence apocalypse is a new fear that keeps many up at night, a terror born of great advances that seem to suggest that, if we are not very careful, we may—with our own hands—bring forth a future where humanity has no place. This strange nightmare is a credible danger only because so many of our dreams are threatening to come true. It is the culmination of a long process that hearkens back to the origins of civilization itself, to the time when the world was filled with magic and dread, and the only way to guarantee our survival was to call down the power of the gods.
Apotheosis has always haunted the soul of humankind. Since ancient times we have suffered the longing to become gods and exceed the limits nature has placed on us. To achieve this, we built altars and performed rituals to ask for wisdom, blessings, and the means to reach beyond our capabilities. While we tend to believe that it is only now, in the modern world, that power and knowledge carry great risks, primitive knowledge was also dangerous, because in antiquity a part of our understanding of the world and ourselves did not come from us, but from the Other. From the gods, from spirits, from raging voices that spoke in silence.
[Labatut invokes the mysteries of the Vedas and their Altar of Fire, which was meant to develop “a mind [that], when properly developed, could fly like a bird with outstretched wings and conquer the skies.”…]
Seen from afar by people who were not aware of what was being made, these men and women must surely have looked like bricklayers gone mad. And that same frantic folly seems to possess those who, in recent decades, have dedicated their hearts and minds to the building of a new mathematical construct, a soulless copy of certain aspects of our thinking that we have chosen to name “artificial intelligence,” a tool so formidable that, if we are to believe the most zealous among its devotees, it will help us reach the heavens and become immortal…
[Labatut recounts the stories– and works– of some of the creators of AI’s DNA: George Boole (and his logic), Claude Shannon (who put that logic to work), and Geoffrey Hinton (Boole’s great-great-grandson, and “the Godfather of AI,” who created some of the first neural networks, but has more recently undergone a change of opinion)…]
… Hinton has been transformed. He has mutated from an evangelist of a new form of reason into a prophet of doom. He says that what changed his mind was the realization that we had, in fact, not replicated our intelligence, but created a superior one.
Or was it something else, perhaps? Did some unconscious part of him whisper that it was he, rather than his great-great-grandfather, who was intended by God to find the mechanisms of thought? Hinton does not believe in God, and he would surely deny his ancestor’s claim that pain is an instrument of the Lord’s will, since he was forced to have every one of his meals on his knees, resting on a pillow like a monk praying at the altar, because of a back injury that caused him excruciating pain. For more than seventeen years, he could not sit down, and only since 2022 has he managed to do so long enough to eat.
Hinton is adamant that the dangers of thinking machines are real. And not just short-term effects like job replacement, disinformation, or autonomous lethal weapons, but an existential risk that some discount as fantasy: that our place in the world might be supplanted by AI. Part of his fear is that he believes AI could actually achieve a sort of immortality, as the Vedic gods did. “The good news,” he has said, “is we figured out how to build things that are immortal. When a piece of hardware dies, they don’t die. If you’ve got the weights stored in some medium and you can find another piece of hardware that can run the same instructions, then you can bring it to life again. So, we’ve got immortality. But it’s not for us.”
Hinton seems to be afraid of what we might see when the embers of the Altar of Fire die down at the end of the sacrifice and the sharp coldness of the beings we have conjured up starts to seep into our bones. Are we really headed for obsolescence? Will humanity perish, not because of the way we treat all that surrounds us, nor due to some massive unthinking rock hurled at us by gravity, but as a consequence of our own irrational need to know all that can be known? The supposed AI apocalypse is different from the mushroom-cloud horror of nuclear war, and unlike the ravages of the wildfires, droughts, and inundations that are becoming commonplace, because it arises from things that we have, since the beginning of civilization, always considered positive and central to what makes us human: reason, intelligence, logic, and the capacity to solve the problems, puzzles, and evils that taint even the most fortunate person’s existence with everyday suffering. But in clawing our way to apotheosis, in daring to follow the footsteps of the Vedic gods who managed to escape from Death, we may shine a light on things that should remain in darkness. Because even if artificial intelligence never lives up to the grand and terrifying nightmare visions that presage a nonhuman world where algorithms hum along without us, we will still have to contend with the myriad effects this technology will have on human society, culture, and economics.
In the meantime, the larger specter of superintelligent AI looms over us. And while it is less likely and perhaps even impossible (nothing but a fairy tale, some say, a horror story intended to attract more money and investment by presenting a series of powerful systems not as the next step in our technological development but as a death-god that ends the world), it cannot be easily dispelled, for it reaches down and touches the fibers of our mythmaking apparatus, that part of our being that is atavistic and fearful, because it reminds us of a time when we shivered in caves and huddled together, while outside in the dark, with eyes that could see in the night, the many savage beasts and monsters of the past sniffed around for traces of our scent.
As every new AI model becomes stronger, as the voices of warning form a chorus, and even the most optimistic among us begin to fear this new technology, it is harder and harder to think without panic or to reason with logic. Thankfully, we have many other talents that don’t answer to reason. And we can always rise and take a step back from the void toward which we have so hurriedly thrown ourselves, by lending an ear to the strange voices that arise from our imagination, that feral territory that will always remain a necessary refuge and counterpoint to rationality.
Faced, as we are, with wild speculation, confronted with dangers that no one, however smart or well informed, is truly capable of managing or understanding, and taunted by the promises of unlimited potential, we may have to sound out the future not merely with science, politics, and reason, but with that devil-eye we use to see in the dark: fiction. Because we can find keys to doors we have yet to encounter in the worlds that authors have imagined in the past. As we grope forward in a daze, battered and bewildered by the capabilities of AI, we could do worse than to think about the desert planet where the protagonists of Herbert’s Dune novels sought to peer into the streaming sands of future time, under the heady spell of a drug called spice, to find the Golden Path, a way for human beings to break from tyranny and avoid extinction or stagnation by being more diverse, resilient, and free, evolving past purely logical reasoning and developing our minds and faculties to the point where our thoughts and actions are unpredictable and not bound by statistics. Herbert’s books, with their strange mixture of past and present, remind us that there are many ways in which we can continue forward while preserving our humanity. AI is here already, but what we choose to do with it and what limits we agree to place on its development remain decisions to be made. No matter how many billions of dollars are invested in the AI companies that promise to eliminate work, solve climate change, cure cancer, and rain down miracles unlike anything we have seen before, we can never fully give ourselves over to these mathematical creatures, these beings with no soul or sympathy, because they are neither alive nor conscious—at least not yet, and certainly not like us—so they do not share the contradictory nature of our minds.
In the coming years, as people armed with AI continue making the world faster, stranger, and more chaotic, we should do all we can to prevent these systems from giving more and more power to the few who can build them. But we should also consider a warning from Herbert, the central commandment he chose to enshrine at the heart of future humanity’s key religious text, a rule meant to keep us from becoming subservient to the products of our reason, and from bowing down before the God of Logic and his many fearsome offspring:
Thou shalt not make a machine in the likeness of a human mind…
Before and after artificial intelligence: “The Gods of Logic” in @Harpers. Eminently worth reading in full.
For a less pessimistic view, see: “A Journey Through the Uncanny Valley: Our Relational Futures with AI,” from @dylanhendricks at @iftf.
* Geoffrey Hinton
###
As we deliberate on Daedalus’ caution, we might send fantastically far-sighted birthday greetings to a techno-optimist who would likely have brushed aside Labatut’s concerns: Hugo Gernsback, a Luxembourgish-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.
Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio. But it was as a writer and publisher that he probably left his most lasting mark: In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.
The “Ralph” series was an astounding success with readers, and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories. Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”
Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”
Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”
(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television.)
“If you could build a house on a trampoline, that would suit me fine”*…
James Coleman on the next best thing: that staple of kids’ birthday parties, the bouncy castle (and its cousins)…
My son turned 8 years old earlier this month. We decided to host a birthday party in our yard with a bunch of his friends from school. As if creating several hours of entertainment for a crew of rambunctious boys wasn’t stressful enough, a week of heavy rain and an ominous forecast threatened the whole event. I did not want to have that conversation with the excited birthday boy.
Fortunately, the rain subsided just as the primary entertainment was delivered: an inflatable bounce house called The Challenge, which we rented from a local vendor. The kids had a lot of fun, and I did too, eventually, after the stress subsided. Once things wrapped up, I offered to help the vendor go through the labor-intensive process of rolling and storing a 200-kilogram inflatable. I can’t say it was good for my back, but the experience made me curious about the larger industry.
Most sources attribute the invention of the inflatable amusement to John Scurlock in the 1950s. Scurlock, who died in 2008, was an electrical engineer, physics professor, and NASA researcher who specialized in plastics. While designing inflatable covers for tennis courts, he came up with the idea for a “Space Pillow” that children could use for acrobatic play. It was little more than an air-filled bag with protective netting, but later he would use the same basic principle to create safety air cushions for fire-fighters and stunt performers. The Scurlock family still manufactures and rents amusements as Space Walk Inflatables. In 2014, they had two hundred branches and managed roughly 35,000 rentals per year. This would put them on the large end of inflatable amusement rental companies, of which there are thousands in the US.

With $20,000 and a truck, you can start renting inflatables. The low startup costs make it an attractive option for many, and there is no shortage of influencers willing to share basic business plans. But the work is arduous, with most weekends spent in a mad dash to clean and deliver amusements. (Stressed-out parents, like myself, are also no picnic.) Because there are so many small players, it is difficult to get estimates of how much money is being made in the market as a whole. Space Walk officials peg it at around a hundred million dollars annually…
More on how they’re made and how they’re tested at “Notes on Inflatable Amusements,” from @jamestweetz in @the_prepared.
* Alan Rickman
###
As we bounce, we might note that this was a momentous date in the history of another celebratory stalwart; it was on this date in 1996 that “Macarena”– more specifically, “Macarena (Bayside Boys Mix)”– hit #1 on Billboard‘s Hot 100 and remained on the chart for 60 weeks.
The original Los Del Rio recording of “Macarena” was a hit in Latin America but would not have gained much attention in North America if it weren’t for John Caride, a DJ at a Miami radio station. Having watched dancers’ enthusiastic reaction to the song at a club at which he was spinning, Caride wanted to add the tune to his radio playlist, but was refused by his program manager on the grounds that the station (WPOW– “Power 96”) didn’t put foreign language songs into rotation. Caride enlisted producers Carlos De Yarza and Mike Triay to re-record the song with English-language verses and then remix it to make it (even more) “club-friendly.” It was this version– “Macarena (Bayside Boys Mix)”– that hit the top of the chart… and became the “No. 1 Greatest One-Hit Wonder of All Time” (per VH-1) and a staple of wedding receptions everywhere.