(Roughly) Daily

Posts Tagged ‘Science Fiction’

“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…

 


 

In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.
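
[An aside, not from the article: treated as a simple doubling every two years and seeded with the 2,300 transistors of 1971’s Intel 4004 (both figures are illustrative assumptions), the trend is easy to sanity-check. A minimal Python sketch:

# Moore's Law as compound doubling. An illustrative model only:
# the two-year period and the 4004 baseline are assumptions.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{transistors(2021):,.0f}")  # 77,175,193,600

Fifty years is twenty-five doublings: 2,300 × 2^25 ≈ 77 billion, the right order of magnitude for the largest commercial chips of the early 2020s.]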

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.
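
[An aside, not from the excerpt: “a resistor with memory” has a precise meaning. In Chua’s formulation a memristor ties the flux linkage φ across the device to the total charge q that has flowed through it, so its effective resistance depends on its entire history:

M(q) = dφ/dq,  v(t) = M(q(t)) · i(t),  q(t) = ∫ i(τ) dτ (accumulated over all past time)

Because the state variable q is an accumulated quantity rather than an instantaneous one, it persists when the current stops– which is why a memristor can hold data without power.]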

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead

###

As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgish-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman who was tight with the fees he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped it enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses

source

 

 

“When I die, I’m leaving my body to science fiction”*…

 


 

If Somnium is the first science fiction book (which many people argue is true), then this is probably the first reference to the idea of zero gravity, or weightlessness.

“…for, as magnetic forces of the earth and moon both attract the body and hold it suspended, the effect is as if neither of them were attracting it…”
– From Somnium (The Dream), by Johannes Kepler, published in 1634

Note that the word “gravity” is not used to describe the attraction between masses; Isaac Newton did not describe universal gravitation until 1687…
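
[An aside: in post-1687 terms, the “suspended” point Kepler intuited is a simple balance of attractions. With Earth mass M_E, lunar mass M_M, and Earth–Moon separation d, a body at distance r from Earth feels equal and opposite pulls when

G·M_E / r² = G·M_M / (d − r)²,  which gives  r = d / (1 + √(M_M / M_E)) ≈ 0.9·d,

i.e., about nine-tenths of the way to the Moon. Kepler’s mechanism (magnetism) was wrong, but the geometry of his guess– a null point between two attracting bodies– was sound.]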

The first entry in Technovelgy’s (@Technovelgy) “Timeline of Science Fiction Ideas, Technology, and Inventions.”  Starting in the 17th century, it contains hundreds of reminders– most linked to info on real-life inventors and inventions that realized the dreams– that imagination is often the inspiration for invention.

* Steven Wright

###

As we ponder precursors, we might recall that on this date in 1954 Gog premiered in Los Angeles.  The third film in Ivan Tors‘ “Office of Scientific Investigation” (OSI) trilogy, following The Magnetic Monster (1953) and Riders to the Stars (also 1954), it starred Richard Egan, Constance Dowling (in her final big-screen role), and Herbert Marshall in a cautionary tale of killer robots.

source

 

“The purpose of a writer is to keep civilization from destroying itself”*…

 

Ted Chiang

 

Traditional “good vs. evil” stories follow a certain pattern: the world starts out as a good place, evil intrudes, good defeats evil, and the world goes back to being a good place. These stories are all about restoring the status quo, so they are implicitly conservative. Real science fiction stories follow a different pattern: the world starts out as a familiar place, a new discovery or invention disrupts everything, and the world is forever changed. These stories show the status quo being overturned, so they are implicitly progressive. (This observation is not original to me; it’s something that scholars of science fiction have long noted.) This was in the context of a discussion about the role of dystopias in science fiction. I said that while some dystopian stories suggest that doom is unavoidable, other ones are intended as cautionary tales, which implies we can do something to avoid the undesirable outcome…

A lot of dystopian stories posit variations on a Mad Max world where marauders roam the wasteland. That’s a kind of change no one wants to see. I think those qualify as doom. What I mean by disruption is not the end of civilization, but the end of a particular way of life. Aristocrats might have thought the world was ending when feudalism was abolished during the French Revolution, but the world didn’t end; the world changed. (The critic John Clute has said that the French Revolution was one of the things that gave rise to science fiction.)…

The familiar is always comfortable, but we need to make a distinction between what is actually desirable and what is simply what we’re accustomed to; sometimes those are the same, and sometimes they are not. The people who are the happiest with the status quo are the ones who benefit most from it, which is why the wealthy are usually conservative; the existing order works to their advantage. For example, right now there’s a discussion taking place about canceling student debt, and a related discussion about why there is such a difference in the type of financial relief available to individuals as opposed to giant corporations. The people who will be happiest to return to our existing system of debt are the ones who benefit from it, and making them uncomfortable might be a good idea…

How we may never go “back to normal”– and why that might be a good thing: Halimah Marcus (@HalimahMarcus) interviews the estimable Ted Chiang.  Read it in full: “Ted Chiang Explains the Disaster Novel We All Suddenly Live In.”

* Albert Camus

###

As we put it all into perspective, we might recall that it was on this date in 1977 that Star Wars was released.  An epic space opera directed and co-written by George Lucas, it was both a box-office and critical success.  The highest-grossing film ever at the time (until the release of E.T. the Extra-Terrestrial in 1982), it is, when adjusted for inflation, the second-highest-grossing film in North America (behind Gone With The Wind).

The film won six Oscars for a variety of technical achievements.  As film critic Roger Ebert wrote in his book The Great Movies, “Like The Birth of a Nation and Citizen Kane, Star Wars was a technical watershed that influenced many of the movies that came after.”  It ushered in a new generation of special-effects-driven, high-energy motion pictures, and it was one of the first films to link genres together to invent a new, high-concept genre for filmmakers to build upon.  And, with Steven Spielberg’s Jaws, it shifted the film industry’s focus away from the personal filmmaking of the 1970s and toward fast-paced, big-budget blockbusters for younger audiences.

The film has been reissued many times and launched an industry of tie-in products, including novels, comics, video games, amusement park attractions, and merchandise such as toys, games, and clothing.  The film’s success led to two critically and commercially successful sequels, The Empire Strikes Back and Return of the Jedi, and later to a prequel trilogy, a sequel trilogy, two anthology films, and various spin-off TV series.

source

 

 

Written by LW

May 25, 2020 at 1:01 am

“The future is there… looking back at us. Trying to make sense of the fiction we will have become”*…

 

Octavia Butler

 

Tim Maughan, an accomplished science fiction writer himself, considers sci-fi works from the 1980s and ’90s, and their predictive power.  Covering Bruce Sterling, William Gibson, Rudy Rucker, Stephen King, P.D. James, an episode of Star Trek: Deep Space Nine, and Blade Runner, he reserves special attention for a most deserving subject…

When you imagine the future, what’s the first date that comes into your mind? 2050? 2070? The year that pops into your head is almost certainly related to how old you are — some point within our lifetimes yet distant enough to be mysterious, still just outside our grasp. For those of us growing up in the 1980s and ’90s — and for a large number of science fiction writers working in those decades — the 2020s felt like that future. A decade we would presumably live to see but also seemed sufficiently far away that it could be a world full of new technologies, social movements, or political changes. A dystopia or a utopia; a world both alien and familiar.

That future is, of course, now…

Two science fiction books set in the 2020s tower over everything else from that era in their terrifying prescience: Octavia Butler’s Parable of the Sower (1993) and Parable of the Talents (1998). These books by the late master kick off in 2024 Los Angeles and are set against a backdrop of a California that’s been ravaged by floods, storms, and droughts brought on by climate change. Middle- and working-class families huddle together in gated communities, attempting to escape the outside world through addictive pharmaceuticals and virtual reality headsets. New religions and conspiracy theory–chasing cults begin to emerge. A caravan of refugees head north to escape the ecological and social collapse, while a far-right extremist president backed by evangelical Christians comes to power using the chillingly familiar election slogan Make America Great Again.

Although it now feels like much in Butler’s Parable books might have been pulled straight from this afternoon’s Twitter or tonight’s evening news, some elements are more far-fetched. The second book ends with followers of the new religion founded by the central character leaving Earth in a spaceship to colonize Alpha Centauri. Butler originally planned to write a third book following the fates of these interstellar explorers but, sadly, passed away in 2006 before she had a chance. She left us with a duology that remains grounded and scarily familiar to those of us struggling to come to terms with the everyday dystopias that the real 2020s already seem to be presenting us.

Not that this remarkable accuracy was ever her objective.

“This was not a book about prophecy; this was an if-this-goes-on story,” Butler said about the books during a talk at MIT in 1998. “This was a cautionary tale, although people have told me it was prophecy. All I have to say to that is I certainly hope not.”

In the same talk, Butler describes in detail the fears that drove her to write this warning: the debate over climate change, the eroding of workers’ rights, the rise of the private prison industry, and the media’s increasing refusal to talk about all of these in favor of focusing on soundbite propaganda and celebrity news. Again, these are fears that feel instantly familiar today…

What Blade Runner, cyberpunk– and Octavia Butler– had to say about the age we’re entering now: “How Science Fiction Imagined the 2020s.”

* William Gibson, Pattern Recognition

###

As we honor prophets, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time.  Two days later, the first Apple Macintosh went on sale.

 

Written by LW

January 22, 2020 at 1:01 am

“Total annihilation has a way of sharpening people’s minds”*…

 

War of the Worlds

 

HG Wells was the great modern prophet of apocalypse…

In five fecund years, from 1895 to 1900, he wrote 12 books, including the ‘scientific romances’ that made his name and laid the foundations of modern science fiction — The Time Machine, The War of the Worlds, The Island of Dr Moreau, The Invisible Man. He paved the way for so much of what came after — the sci-fi of Huxley, Orwell, Olaf Stapledon, Arthur C. Clarke, JG Ballard and Michael Crichton, and his books have inspired over 30 films, with The Invisible Man set for another remake this year…

His books — both fiction and non-fiction — are tales of apocalypse, which in the ancient Greek etymology means ‘the unveiling or unfolding of things not previously known and which could not be known apart from the unveiling’.

What you meet in Wells’ books, again and again, is the violent uncovering of the new, the ripping back of the lace curtain of Victorian customs. Like Ballard, Wells had a sense of how suddenly and utterly things can change, how long familiar and ingrained customs can disappear in a moment. Victorian England must have seemed like it would stay the same forever and ever. And then, suddenly, Queen Victoria is removed ‘like a great paperweight’, and everything is in flux.

Australians are learning that today — how everything we take for granted — homes, food supplies, electricity, water, clean air, even law and order — can be taken from one in an instant. Likewise, The War of the Worlds gave complacent imperial Victorians a sudden sense of what it’s like to be conquered and humiliated, to be scrabbling for survival. ‘I felt a sense of dethronement, a persuasion that I was no longer a master, but an animal among the animals, under the Martian heel’…

What can he teach us about our present moment? How can we survive and endure the apocalyptic unravelling of hydrocarbon capitalism, which is what (I suggest) we vividly see happening today? The most important lessons he gave us are (1) take the Long View and (2) don’t turn away from technological innovation, however dangerous and unsettling it is…

Jules Evans (on a return visit, having supplied yesterday’s subject) explains: “What HG Wells can teach us about surviving apocalypse.”

* Yuval Noah Harari, 21 Lessons for the 21st Century

###

As we batten down the hatches, we might recall that it was on this date in 749 that a devastating earthquake struck parts of Palestine and the Transjordan, epicentered in Galilee.  The cities of Tiberias, Beit She’an, Hippos, and Pella were largely destroyed, while many other cities across the Levant were heavily damaged; the casualties numbered in the tens of thousands.


Scythopolis (Beit She’an) was one of the cities destroyed in the earthquake of 749

source

 
