(Roughly) Daily

“Do I rue a life wasted doing crosswords? Yes, but I do know the three-letter word for ‘regret'”*…

F. Gregory Hartswick, an early author of crossword books

Efforts to diversify the crossword puzzle industry might be having the opposite effect. As Matt Hartman explains, although puzzles are an increasingly important part of The New York Times’ and others’ business strategies, only a handful of people actually make a living from crosswords…

The conspiracy theory writes itself. Start looking, and you’ll notice how many New York Times crossword puzzles are co-constructed (the preferred term for what most people would refer to as co-written) by a professional crossword constructor and someone with a day job—it’s hard not to see all the artists, web developers, professors, and others whose titles imply a degree of wealth and elite connections. As the pandemic handed the work-from-home class extra time for their hobbies, the number of first-timers published in the Times has skyrocketed. Obviously, rich people are paying others to get the glory of their name in ink.

But the theory is almost diametrically wrong. It turns out the crossword industry really does consist of earnest wordplay lovers donating their time to unpaid mentorships, generally as part of an industry-wide effort to bring new and underrepresented people into crosswords.

Unfortunately, the end result might be even more exclusive than a pay-to-play scheme. And a game that brings the Times at least one million monthly subscribers—at $1.25 a week or $40 for a year—provides a sustainable living wage for shockingly few people…

Learn why at “Inside the Elite, Underpaid, and Weird World of Crossword Writers,” from @themhartman in @newrepublic.

* Robert Breault

###

As we fill in the blanks, we might recall that it was on this date in 1997 that South Park premiered on Comedy Central– where it runs to this day. The animated saga of Stan, Kyle, Eric, and Kenny and their exploits in their (titular) Colorado hometown has won five Emmys and a Peabody Award. A theatrical film, South Park: Bigger, Longer & Uncut, was released in June 1999 to commercial and critical success, and scored an Academy Award nomination.

source

Written by (Roughly) Daily

August 13, 2022 at 1:00 am

“Can’t have dirty garbage!”*…

Rebecca Alter, with a paean to an unexpected TikTok delight…

At some point earlier this year, my For You Page changed for the better. Between cute boys making sandwiches, Brian Jordan Alvarez videos, and American Girl Doll memes, I started getting the occasional video from @nycsanitation. I don’t think I’ve ever watched through a full video on TikTok from any government department, local or federal, but @nycsanitation has clawed its way through algorithms and attention spans to be that rarest of finds: an official organization or company account that’s actually good. The Department comes across in its TikToks as a bunch of genuine, hardworking salt-of-the-earth folks. I mean that literally; @nycsanitation TikTok reminds us that they’re the ones in charge of salting the streets in winter…

Read on for wondrous examples featuring googly-eyed snowplow trucks and earnest charm: “The Department of Sanitation Has an Oddly Excellent TikTok,” from @ralter in @Curbed.

* SpongeBob

###

As we keep it clean, we might recall that it was on this date in 1865 that Joseph Lister, a student of Pasteur’s germ theory, performed the first successful antiseptic surgery (using carbolic acid to disinfect a compound fracture suffered by an 11-year-old boy). After four days, he found that no infection had developed; after a total of six weeks, he was amazed to discover that the boy’s bones had fused back together, without suppuration. He subsequently published his results in The Lancet in a series of six articles, running from March through July 1867.

Lister went on to extend his approach, instructing surgeons under his responsibility to wear clean gloves and to wash their hands before and after operations with five percent carbolic acid solutions. Instruments were washed in the same solution, and assistants sprayed the solution in the operating room.

At first, his suggestions were criticized: germ theory was in its infancy, and his techniques were deemed too taxing. But his results– a sharp reduction in post-op infection and death– ultimately carried the day. Indeed, he so revolutionized his field that he is known as the “father of modern surgery.”

source

“My favorite food from my homeland is Guinness. My second choice is Guinness. My third choice – would have to be Guinness.”*…

As Will O’Brien explains, Ireland’s most famous brewery has been ahead of the curve for 250 years…

Taken over its entire history, Guinness may just be the most successful company Ireland has ever produced. In 1930, it was the seventh largest company in Britain or Ireland. It is one of our oldest companies of note. Considering that it predates the Bank of Ireland and the State itself, it could even be said that Guinness is the longest-running successful large institution in Ireland.

The key to Guinness’ robustness has been innovation. Through a series of key innovations, Guinness was able to stay on top despite (among other things) a famine, mass emigration, two World Wars, a civil war, and the changeover from British to sovereign rule. Guinness is responsible for changes in workplace relations, several foundational advances in the physics of brewing, and even the famous Student’s t-test in statistics. Indeed, Guinness has been one of the key drivers of innovation in Ireland.

A determined founder began Guinness with a vision and the bold decision to sign a 9,000-year lease. The company then built a brewery that defied nearly every norm in workplace relations. They used the scientific method to radically rethink how beer is brewed and served, and created a world-class brand and marketing operation.

When Guinness released a subtly different pint glass several years ago, traditionalists decried it as blasphemous. The irony is that the brewery that creates this drink has eschewed tradition for over 250 years…

Lessons are where one finds them: “No Great Stagnation in Guinness,” from @willobri.
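[An aside for the quantitatively inclined: the “Student” of the Student’s t-test mentioned above was William Sealy Gosset, a Guinness brewer who published under a pseudonym because the company kept its methods secret. His problem was drawing sound conclusions about barley and beer from very small samples. A minimal sketch of his one-sample test in Python– the yield figures are invented for illustration, and the scipy call is the standard modern implementation, not Guinness’s own tooling:]

    # Gosset's problem in miniature: given only a handful of measurements,
    # is a batch consistent with a claimed average? (Numbers are invented.)
    from scipy import stats

    claimed_mean = 1500                       # hypothesized mean yield, illustrative units
    samples = [1480, 1510, 1455, 1490, 1470]  # a tiny sample -- exactly Gosset's regime

    # Student's one-sample t-test: how surprising is this sample's mean
    # if the true mean really were claimed_mean?
    t_stat, p_value = stats.ttest_1samp(samples, popmean=claimed_mean)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # large p => no evidence of a shortfall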

* Peter O’Toole

###

As we contemplate continuity, we might recall that it was on this date in 1903 that the first U.S. patent for instant coffee (No. 735,777) was issued to Satori Kato of Chicago, Illinois. The application was filed in April of 1901, when his Kato Coffee Company introduced the product at the Pan-American Exposition in Buffalo.

A brochure for the Kato Coffee Company, from the 1901 Pan-American Exposition

source

Written by (Roughly) Daily

August 11, 2022 at 7:59 am

“‘I wish it need not have happened in my time,’ said Frodo. ‘So do I,’ said Gandalf, ‘and so do all who live to see such times. But that is not for them to decide. All we have to decide is what to do with the time that is given us.'”*…

A couple of weeks ago, (Roughly) Daily took a look at the fall of neoliberalism. What’s to come? The estimable Noah Smith has a suggestion…

For years now, I’ve been thinking about what the next big organizing principle of U.S. political economy will be. By “political economy” here I mean the type of economic policies we carry out, and the ways that we expect those policies to reshape our economy. This will be the first in a series of posts laying out my predictions for what the new paradigm will look like.

From the late 1970s through the middle of the 2000s, our organizing principle was what some people call “neoliberalism” — deregulation, tax cuts, free trade, and the shift of the welfare state towards in-kind benefits and work requirements. The reasons we went down this road were complex, and the results were mixed. This replaced an earlier paradigm that people called “the New Deal”, which started to emerge during the Great Depression but really solidified during and just after WW2. That paradigm involved large-scale government investment, heavy regulation, high taxes, social insurance, and the encouragement of a corporate welfare state.

Ever since the financial crisis and the Great Recession of 2008-12, we’ve been looking for a new organizing principle. Obama didn’t really try to give us one; with the exception of Obamacare, he was mostly focused on crisis recovery and damage control (stimulus, financial regulation, boosting the welfare state incrementally along largely neoliberal lines).

But everyone knew a new paradigm was needed. The question was what it would be…

[After carefully considering, then sadly rejecting climate change as a candidate…]

So if it’s not climate change, what will be the thing that forces us to come up with a new policy paradigm? If it’s not the moral equivalent of war, perhaps it’ll be the threat of actual war…

“The War Economy,” Part 1

In a second post, he elaborates on how the U.S. and its allies might stack up against a “New Axis.” He dives into relative demographic, economic, and social strengths, concluding…

I can’t say whether or not the New Axis is the most formidable military competitor that the U.S. and its allies have ever faced. The original Axis was certainly fearsome, and the USSR had tens of thousands of nuclear weapons ready to roast the world at the touch of a button. But I think that the comparisons above show that the New Axis certainly represents an economic competitor like none the U.S. and its allies have ever faced. And the reason is simply China. Russia is mainly a gas station with nukes. But China has three things going for it:

  • China has far, far more workers than the original Axis or the Soviet bloc.
  • China has advanced manufacturing technology that probably rivals the original Axis in relative terms, and far exceeds the Soviet bloc.
  • China has the world’s largest manufacturing cluster, making it the “make everything country”, which neither the Axis nor the USSR managed to be.

He continues…

This is simply a unique situation in modern history. The Industrial Revolution began in Europe and spread to the U.S. and the East Asian rim. The aftermath of WW2 saw central Europe and the East Asian rim incorporated into a U.S.-led alliance that dominated global manufacturing in a way that the communist powers could never threaten. Now, with the rise of China, world manufacturing is divided roughly in two.

Much of the War Economy in the U.S. (and its allies) will therefore be about rediscovering the manufacturing capabilities they neglected during China’s meteoric rise…

“The War Economy, Part 2: Sizing up the New Axis”

The Brookings Institution recently published its own (and very resonant) assessment of U.S. readiness, “The Sources of Societal Competitiveness.” And Nathan Gardels followed with a trenchant reminder that consensus on national security is a double-edged sword…

In the end, the enduring vitality of any country must be built primarily on the wherewithal within, not on the shaky foundation of menacing adversaries without. George Kennan, architect of the containment strategy against the Soviet Union, understood that lasting vigor comes from the inner confidence of a nation that thrives on its own terms and doesn’t rely on enemies to hold it together. External threats may spur a welcome renewal, but it will remain fragile if that becomes its purpose.

Kennan believed correctly that the West would ultimately be victorious in the Cold War not on some battlefield but through the organic strength of a robust society that no adversary could match.

The same perspective applies today with respect to the challenge of assertive autocracies, especially China. The most important contribution democracies can make to fostering more freedom in the world is to demonstrate through their own institutional integrity and innovations how a governing consensus can be reached by non-authoritarian means.

“When Domestic Unity Is Built On Foreign Enemies”

We live in interesting times. Eminently worth reading all of the links in full.

* J.R.R. Tolkien, The Fellowship of the Ring

###

As we return to first principles, we might recall that it was on this date in 1945 that the Japanese Foreign Ministry sent telegrams to the Allies (by way of Max Grässli at the Swiss Department of Foreign Affairs) announcing that Japan would accept the Potsdam Declaration. The surrender of the Empire of Japan was announced by Japanese Emperor Hirohito on 15 August and formally signed on 2 September 1945, bringing the hostilities of World War II to a close.

Japanese Foreign Minister Mamoru Shigemitsu signs the Instrument of Surrender on behalf of the Japanese Government, on board USS Missouri (BB-63), 2 September 1945. Lieutenant General Richard K. Sutherland, U.S. Army, watches from the opposite side of the table. Foreign Ministry representative Toshikazu Kase is assisting Mr. Shigemitsu. Photograph from the Army Signal Corps Collection in the U.S. National Archives.

source

Written by (Roughly) Daily

August 10, 2022 at 1:00 am

“The cyborg would not recognize the Garden of Eden; it is not made of mud and cannot dream of returning to dust.”*…

Here I had tried a straightforward extrapolation of technology, and found myself precipitated over an abyss. It’s a problem we face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity — a place where extrapolation breaks down and new models must be applied — and the world will pass beyond our understanding.

Vernor Vinge, True Names and Other Dangers

The once-vibrant transhumanist movement doesn’t capture as much attention as it used to; but as George Dvorsky explains, its ideas are far from dead. Indeed, they helped seed the Futurist movements that are so prominent today (and here and here)…

[On the heels of 9/11] transhumanism made a lot of sense to me, as it seemed to represent the logical next step in our evolution, albeit an evolution guided by humans and not Darwinian selection. As a cultural and intellectual movement, transhumanism seeks to improve the human condition by developing, promoting, and disseminating technologies that significantly augment our cognitive, physical, and psychological capabilities. When I first stumbled upon the movement, the technological enablers of transhumanism were starting to come into focus: genomics, cybernetics, artificial intelligence, and nanotechnology. These tools carried the potential to radically transform our species, leading to humans with augmented intelligence and memory, unlimited lifespans, and entirely new physical and cognitive capabilities. And as a nascent Buddhist, it meant a lot to me that transhumanism held the potential to alleviate a considerable amount of suffering through the elimination of disease, infirmity, mental disorders, and the ravages of aging.

The idea that humans would transition to a posthuman state seemed both inevitable and desirable, but, having an apparently functional brain, I immediately recognized the potential for tremendous harm.

The term “transhumanism” popped into existence during the 20th century, but the idea has been around for a lot longer than that.

The quest for immortality has always been a part of our history, and it probably always will be. The Mesopotamian Epic of Gilgamesh is the earliest written example, while the Fountain of Youth—the literal Fountain of Youth—was the obsession of Spanish explorer Juan Ponce de León.

Notions that humans could somehow be modified or enhanced appeared during the European Enlightenment of the 18th century, with French philosopher Denis Diderot arguing that humans might someday redesign themselves into a multitude of types “whose future and final organic structure it’s impossible to predict,” as he wrote in D’Alembert’s Dream…

The Russian cosmists of the late 19th and early 20th centuries foreshadowed modern transhumanism, as they ruminated on space travel, physical rejuvenation, immortality, and the possibility of bringing the dead back to life, the latter being a portent of cryonics—a staple of modern transhumanist thinking. From the 1920s through to the 1950s, thinkers such as British biologist J. B. S. Haldane, Irish scientist J. D. Bernal, and British biologist Julian Huxley (who popularized the term “transhumanism” in a 1957 essay) were openly advocating for such things as artificial wombs, human clones, cybernetic implants, biological enhancements, and space exploration.

It wasn’t until the 1990s, however, that a cohesive transhumanist movement emerged, a development largely brought about by—you guessed it—the internet…

[There follows a brisk and helpful history of transhumanist thought, then an account of the recent past, and present…]

Some of the transhumanist groups that emerged in the 1990s and 2000s still exist or evolved into new forms, and while a strong pro-transhumanist subculture remains, the larger public seems detached and largely uninterested. But that’s not to say that these groups, or the transhumanist movement in general, didn’t have an impact…

“I think the movements had mainly an impact as intellectual salons where blue-sky discussions made people find important issues they later dug into professionally,” said Sandberg. He pointed to Oxford University philosopher and transhumanist Nick Bostrom, who “discovered the importance of existential risk for thinking about the long-term future,” which resulted in an entirely new research direction. The Center for the Study of Existential Risk at the University of Cambridge and the Future of Humanity Institute at Oxford are the direct results of Bostrom’s work. Sandberg also cited artificial intelligence theorist Eliezer Yudkowsky, who “refined thinking about AI that led to the AI safety community forming,” and also the transhumanist “cryptoanarchists” who “did the groundwork for the cryptocurrency world,” he added. Indeed, Vitalik Buterin, a co-founder of Ethereum, subscribes to transhumanist thinking, and his father, Dmitry, used to attend our meetings at the Toronto Transhumanist Association…

Intellectual history: “What Ever Happened to the Transhumanists?,” from @dvorsky.

See also: “The Heaven of the Transhumanists” from @GenofMod (source of the image above).

* Donna Haraway

###

As we muse on mortality, we might send carefully calculated birthday greetings to Marvin Minsky; he was born on this date in 1927.  A mathematician and cognitive scientist by training, he was founding director of MIT’s Artificial Intelligence Project (the MIT AI Lab).  Minsky authored several widely used texts, and made many contributions to AI, cognitive psychology, mathematics, computational linguistics, robotics, and optics.  He held several patents, including those for the first neural-network simulator (SNARC, 1951), the first head-mounted graphical display, the first confocal scanning microscope, and the LOGO “turtle” device (with his friend and frequent collaborator Seymour Papert).  His other inventions include mechanical hands and the “Muse” synthesizer.

 source
