(Roughly) Daily


“It is difficult to predict, especially the future”*…

An amusing attempt to take the long view…

W. Cade Gall’s delightful “Future Dictates of Fashion” — published in the June 1893 issue of The Strand magazine — is built on the premise that a book from a hundred years in the future (published in 1993) called The Past Dictates of Fashion has been inexplicably found in a library. The piece proceeds to divulge this mysterious book’s contents — namely, a look back at the last century of fashion, which, of course, for the reader in 1893, would be looking forward across the next hundred years. In this imagined future, fashion has become a much respected science (studied in University from the 1950s onwards) and is seen to be “governed by immutable laws”.

The designs themselves have a somewhat unaccountable leaning toward the medieval, or as John Ptak astutely notes, “a weird alien/Buck Rogers/Dr. Seuss/Wizard of Oz quality”. If indeed this was a genuine attempt by the author Gall to imagine what the future of fashion might look like, it’s fascinating to see how far off the mark he was (excluding perhaps the 60s and 70s), proving yet again how difficult it is to predict future aesthetics. It is also fascinating to see how little Gall imagines clothes changing across the decades (e.g. 1970 doesn’t seem so different to 1920) and to see which aspects of his present he was unable to see beyond (e.g. the long length of women’s skirts and the seemingly ubiquitous frill). As is often the case when we come into contact with historic attempts to predict a future which for us is now past, it is as if glimpsing into another possible world, a parallel universe that could have been (or which, perhaps, did indeed play out “somewhere”)…

More at: “Sartorial Foresight: Future Dictates of Fashion (1893)” in @PublicDomainRev.

Browse the original on the Internet Archive.

* Niels Bohr (after a Danish proverb)

###

As we ponder the problem of prognostication, we might recall that it was on this date in 1934 that producer Samuel Goldwyn bought the film rights to L. Frank Baum’s book, The Wonderful Wizard of Oz, which had been a hit since its publication in 1900 but had until then been considered both inappropriate (as it was a “children’s book”) and too hard to film. Goldwyn was banking on the drawing power of his child star Shirley Temple, the original choice for Dorothy; but (as everyone knows) the role went to Judy Garland, who won a special “Best Juvenile Performer” Oscar and made the award-winning song “Somewhere Over the Rainbow” a huge hit.

The film was only a modest box-office success on release… but has of course become a beloved classic.


“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…

It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…

2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.

Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:

• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.

• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.

• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.

• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.

• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.

And skipping ahead to just the past two weeks:

• nuclear fusion ignition with net energy gain was achieved for the second time

• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans

• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).

Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”

The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…

Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:

  1. People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
  2. An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.

[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle and Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]

As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.

In the meantime, I’m pretty psyched about the cancer drugs…

As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”

On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.

* Max Planck

###

As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he created the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never saw.

Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.


“The future belonged to the showy and the promiscuous”*…

Emily J. Orlando on the enduring relevance and the foresight of Edith Wharton

If ever there were a good time to read the American writer Edith Wharton, who published over forty books across four decades, it’s now. Those who think they don’t know Wharton might be surprised to learn they do. A reverence for Wharton’s fiction informs HBO’s Sex and the City, whose pilot features Carrie Bradshaw’s “welcome to the age of un-innocence.” The CW’s Gossip Girl opens, like Wharton’s The House of Mirth, with a bachelor spying an out-of-reach love interest at Grand Central Station while Season 2 reminds us that “Before Gossip Girl, there was Edith Wharton.”

But why Wharton? Why now? Perhaps it’s because for all its new technologies, conveniences, and modes of travel and communication, our own “Gilded Age” is a lot like hers [see here]. For the post-war and post-flu-epidemic climate that engendered her Pulitzer-Prize-winning novel The Age of Innocence is not far removed from our post-COVID-19 reality. In both historical moments, citizens of the world have witnessed a retreat into conservatism and a rise of white supremacy.

Fringe groups like the “Proud Boys” and “QAnon” and deniers of everything from the coronavirus to climate change are invited to the table in the name of free speech and here Wharton’s distrust of false narratives resonates particularly well. Post-9/11 calls for patriotism and the alignment of the American flag with one political party harken back to Wharton’s poignant questioning, in a 1919 letter, of the compulsion to profess national allegiance:

how much longer are we going to think it necessary to be “American” before (or in contradistinction to) being cultivated, being enlightened, being humane, & having the same intellectual discipline as other civilized countries?

Her cosmopolitan critique of nationalist fervor remains instructive to us today…

Eminently worth reading in full (then picking up one of Wharton’s wonderful novels): “How Edith Wharton Foresaw the 21st Century,” in @lithub.

See also: “These days, the bigger the company, the less you can figure out what it does.”

* Edith Wharton, The Custom of the Country

###

As we prize perspicacity, we might recall that it was on this date in 1884, in the midst of the Gilded Age, that Harper’s Bazar proclaimed, “…it is not convenable, according to European ideas, to wear a loose flowing robe of the tea-gown pattern out of one’s bedroom or boudoir. It has been done by ignorant people at a watering-place, but it never looks well. It is really an undress, although lace and satin may be used in its composition. A plain, high, and tight-fitting garment is much the more elegant dress for the afternoon teas as we give them.”

Embraced by artists and reformers, the Aesthetic Dress Movement of the 1870s and 1880s was a non-mainstream movement within fashion that looked to the Renaissance and Rococo periods for inspiration. The movement began in response to reformers seeking to call attention to the unhealthy side effects of wearing a corset, thus, the main feature of this movement in women’s dress was the loose-fitting dress, which was worn without a corset. Artists and progressive social reformers embraced the Aesthetic Dress movement by appearing uncorseted and in loose-fitting dresses in public. For many that fell into these categories, Aesthetic Dress was an artistic statement. Appearing in public uncorseted was considered controversial for women, as it suggested intimacy. In fact, many women across the country were arrested for appearing in public wearing Aesthetic costumes, as authorities and more conservative citizens associated this type of dress with prostitution.

But for most wealthy women, the influence of the Aesthetic Dress movement on their wardrobes took the form of the Tea Gowns. Like most dresses that could be considered “Aesthetic,” Tea Gowns were loose and meant to be worn without a corset. However, they were less controversial than the Aesthetic ensembles of more artistic and progressive women. This is because they were not typically worn in public or in the company of the opposite sex. Tea Gowns were a common ensemble for hosts of all-female teas that were held in the wearer’s home. Thus, because no men were in attendance, Tea Gowns were socially acceptable in these scenarios. Mainstream magazines like Harper’s Bazar were not especially keen on the Tea Gown and cautioned their readers not to appear wearing one in public. 

“Gilded Age Fashion”

For a sense of what was at stake, see “The Corset X-Rays of Dr Ludovic O’Followell (1908)”


Written by (Roughly) Daily

January 26, 2023 at 1:00 am

“Foresight begins when we accept that we are now creating a civilization of risk”*…

There have been a handful of folks– Vernor Vinge, Don Michael, Sherry Turkle, to name a few– who were, decades ago, exceptionally foresightful about the technologically-mediated present in which we live. Philip Agre belongs in their number…

In 1994 — before most Americans had an email address or Internet access or even a personal computer — Philip Agre foresaw that computers would one day facilitate the mass collection of data on everything in society.

That process would change and simplify human behavior, wrote the then-UCLA humanities professor. And because that data would be collected not by a single, powerful “big brother” government but by lots of entities for lots of different purposes, he predicted that people would willingly part with massive amounts of information about their most personal fears and desires.

“Genuinely worrisome developments can seem ‘not so bad’ simply for lacking the overt horrors of Orwell’s dystopia,” wrote Agre, who has a doctorate in computer science from the Massachusetts Institute of Technology, in an academic paper.

Nearly 30 years later, Agre’s paper seems eerily prescient, a startling vision of a future that has come to pass in the form of a data industrial complex that knows no borders and few laws. Data collected by disparate ad networks and mobile apps for myriad purposes is being used to sway elections or, in at least one case, to out a gay priest. But Agre didn’t stop there. He foresaw the authoritarian misuse of facial recognition technology, he predicted our inability to resist well-crafted disinformation and he foretold that artificial intelligence would be put to dark uses if not subjected to moral and philosophical inquiry.

Then, no one listened. Now, many of Agre’s former colleagues and friends say they’ve been thinking about him more in recent years, and rereading his work, as pitfalls of the Internet’s explosive and unchecked growth have come into relief, eroding democracy and helping to facilitate a violent uprising on the steps of the U.S. Capitol in January.

“We’re living in the aftermath of ignoring people like Phil,” said Marc Rotenberg, who edited a book with Agre in 1998 on technology and privacy, and is now founder and executive director for the Center for AI and Digital Policy…

As Reed Albergotti (@ReedAlbergotti) explains, better late than never: “He predicted the dark side of the Internet 30 years ago. Why did no one listen?”

Agre’s papers are here.

* Jacques Ellul

###

As we consider consequences, we might recall that it was on this date in 1858 that Queen Victoria sent the first official telegraph message across the Atlantic Ocean from London to U.S. President James Buchanan in Washington, D.C.– initiating a new era in global communications.

Transmission of the message began at 10:50am and wasn’t completed until 4:30am the next day, taking nearly eighteen hours to reach Newfoundland, Canada. Ninety-nine words, containing five hundred nine letters, were transmitted at a rate of about two minutes per letter.

After White House staff had satisfied themselves that it wasn’t a hoax, the President sent a reply of 143 words in a relatively rapid ten hours. Without the cable, a dispatch in one direction alone would have taken roughly twelve days by the speediest combination of inland telegraph and fast steamer.
