(Roughly) Daily

Posts Tagged ‘Science Fiction’

“We’re long on high principles and short on simple human understanding”*…

Really, most science fiction is about economics. What makes most future visions interesting is not just the technical particulars of the cool new Stuff, but the social ramifications. Here are some of the sci-fi books that I thought dealt with important economic issues in the most insightful and interesting ways. I also chose only books that I think are well-written, with well-conceived characters, engaging plots, and skillful writing.

1. A Deepness in the Sky, by Vernor Vinge

In addition to being quite possibly the best science fiction novel I’ve ever read, Deepness is also a great meditation on public economics. When Vernor Vinge became famous in the 80s, he was a hard-core libertarian – his novel The Peace War, and its sequel short story “The Ungoverned”, are like a Real Business Cycle model come to life, with lone-wolf genius engineers teaming up with private police forces to bring down a fascist technocratic government made up of…university administrators. Ha. But by the 90s, Vinge’s views on government and markets had become markedly more nuanced – in the swashbuckling space opera A Fire Upon the Deep, we see private security forces failing miserably when faced with a powerful external threat (in fact, that book made me think of the “Tamerlane Principle”). Security, Vinge realizes, is a public good.

In Deepness, Vinge adds another public good: Research. The narrative of Deepness is split between a race of spider-people with roughly 20th-century technology, and a spacefaring guild of human merchants called the Qeng Ho. On the spider world, the protagonist is a brilliant scientist named Sherkaner Underhill, who is basically a Von Neumann or Feynman type. Sherkaner is the ultimate lone genius, but he ends up needing the government to fund his research. In space, meanwhile, the heroic merchant entrepreneur Pham Nuwen (who is a recurring protagonist in Vinge novels) struggles to decide whether he should turn his merchant fleet into an interstellar government. Governments, he finds, are good at producing really new scientific breakthroughs, but eventually they become unwieldy and stifle the economy and society, then collapse under their own institutional weight. The very very end of the book is – or at least, seemed to me to be – a metaphor for the Great Stagnation and the death (and future rebirth) of Big Science…

Seventeen other wonderful recommendations from the always-insightful economist and social/political analyst Noah Smith (@Noahpinion): “Science fiction novels for economists.” (Your correspondent has read many/most of them and enthusiastically seconds the suggestions.)

* Vernor Vinge, A Deepness in the Sky

###

As we celebrate informative speculation, we might recall that it was on this date in 1949 that George Orwell published his masterpiece of dystopian speculative fiction, Nineteen Eighty-Four, and introduced terms like “Big Brother,” “doublethink,” “thoughtcrime,” “Newspeak,” and “memory hole” into the vernacular.

source

Written by (Roughly) Daily

June 8, 2021 at 1:01 am

“So many books, so little time”*…

Dear The Sophist, 

I own a lot of books, and nearly enough shelves to fit them. I haven’t read most of them—has anyone with a lot of books read most of them?—yet I still get impulses to buy more. Can you please tell me why it’s OK for me to buy more books? I should add that I live with a partner who doesn’t own a lot of books, but tolerates mine so far. So far.

—Tome-escent

Dear Volume Purchaser,

Books are ridiculous objects to buy, aren’t they? For the sake of spending a day or two, maybe a week, with some author’s thoughts and words, you take custody of this physical item that sticks around, and around, as more and more others accumulate along with it. You look at them, almost unseeingly, day after day; the walls of your rooms press in; you pay extra money to the movers to drag the extra weight around from one dwelling to the next, all because you read an interesting review once or a cover caught your eye in a bookstore.  

You know what else is ridiculous? The sheer impermanence of thought. The constant yet ephemeral flickering of partial understanding across the synapses in our wet and mortal brains, and the dry circuits of the junky and even more short-lived electronic ersatz brains we rely on for backup. A book is an investment against forgetting and death—a poor investment, but it beats the alternatives. It is a slippery yet real toehold on eternity… If you stop the flow of new books, you stop this flow of possibilities…

Too many books? Tom Scocca (@tomscocca) explains that there’s no such thing as too many books. (via the ever-illuminating Today in Tabs)

And lest one fear that the only option is to buy books, remember the Public Library…

Central Library, Kansas City (source)

* Frank Zappa

###

As we reorganize our shelves, we might spare a thought for someone whose works definitely deserve places of honor thereon, Octavia Estelle Butler; she died on this date in 2006. An African American woman science fiction author, she was a rarity in her field. But her primary distinction was her extraordinary talent, as manifest in novels and stories that stretch the imagination even as they explore the all-too-real truths of the human condition. She was a multiple recipient of both the Hugo and Nebula awards, and became (in 1995) the first science-fiction writer to receive a MacArthur Fellowship.

It’s a measure of her insight that her work– perhaps especially her “Parable” series– is being re-discovered as painfully prescient of our current times.

source

Written by (Roughly) Daily

February 24, 2021 at 1:01 am

“The details are not the details. They make the design.”*…

It’s 2020 and our systems are failing us. We are increasingly reliant on technology that automates bias. We are celebrating “essential workers” while they are underpaid and their work is precarious. We are protesting in the streets because of policing systems that put black and brown people at risk every day. We use apps for travel, shopping, and transportation that productize exploitative labor practices. The list goes on and on.

How did we get here? These systems didn’t just emerge of their own accord. They were crafted by people who made hundreds of decisions, big and small, that led to the outcomes we see now. In other words, these systems and all of their component parts were designed. And for the most part, they were designed with processes intended to create positive user experiences. So what went wrong? Might the way we approach design be contributing to the problems we now experience?

It’s unlikely that the techniques that got us into this situation will be the ones to get us out of it. In this essay, we’re going to take a deeper look at dominant design practices — specifically user-centered design — to identify where our frameworks might be failing us and how we can expand our design practices to close those gaps.

Any framework is a lens through which you see things. A lens allows you to see some things quite well, but almost always at the expense of obscuring others. Prior to the development of user-centered design, technological experiences were primarily designed through the lens of business needs. The needs of the user were only considered insofar as they furthered or hindered those goals, but it was the bottom line that was firmly the focal point of that approach.

User-centered design (UCD) was developed in reaction to those blind spots. It advocated for a design practice that instead focused on the person using the technology, and was intended to create experiences based on an understanding of their needs and goals. As designers, we’ve spent much of the last 25 years convincing our peers of the virtues of putting user needs at the center of our design process.

This practice has produced some amazing products, services and technical innovations. And for designers who entered the industry in the past decade or so, UCD has become a default mindset and approach. By empathizing with users and designing with their needs and wants in mind, we have strived to create products that are more helpful, more intuitive, and less stressful. Certainly many of the digital tools & platforms we use today would not have been possible without the contributions of designers and the user-centered approach.

However, like any lens, UCD also has its own blind spots, and those have played a role in leading us to our current state…

As the world grows increasingly complex, the limitations of user-centered design are becoming painfully obvious. Alexis Lloyd (@alexislloyd) on what’s gone wrong and how we can fix it: “Camera Obscura: Beyond the lens of user-centered design.”

[Via Patrick Tanguay‘s ever-illuminating Sentiers]

For an amusingly– and amazingly– apposite example of inclusively-empathetic design, see “‘If the aliens lay eggs, how does that affect architecture?’: sci-fi writers on how they build their worlds.”

* Charles Eames

###

As we ideate inclusively, we might recall that on this date in 1993 (following President George H.W. Bush’s executive order in 1992) Martin Luther King Jr. Day was officially proclaimed a holiday for the first time in all 50 states. Bush’s order was not fully implemented until 2000, when Utah, the last state fully to recognize the holiday, formally observed it. (Utah had previously celebrated the holiday at the same time but under the name Human Rights Day.)

source

Written by (Roughly) Daily

January 18, 2021 at 1:01 am

“The nations of the world must now stay united in the struggle against unknown forces instead of fighting each other”*…

Toho’s [see here] The Mysterians (1957) is a mammoth sci-fi spectacle, featuring giant lasers, flying saucers, underground domes, alien invaders, and robot monsters. Lying beneath its visual prowess is a set of questions, themes, and ideas that elevate The Mysterians as one of the decade’s most fascinating films. It asserts a warning for humanity: don’t misuse science. For 1957, in the midst of a spiraling nuclear arms race between the United States and the Soviet Union, the film is chilling; but when examined through the lens of 2020, The Mysterians is arguably even more frightening today.

In the film, a series of earthquakes and forest fires precedes the appearance of a giant robot, Mogera. The mechanical monster wreaks havoc before it is blown up by the self-defence forces. The next day, a gigantic dome emerges from the ground, and we are introduced to the robot’s creators: the Mysterians. They beckon key scientists to meet them in their base, where they explain themselves as a race ravaged by atomic war. The Mysterians want three kilometres of land on which to live, but they also have an unpleasant stipulation. The Mysterians’ bodies are so damaged by radiation that they can no longer birth healthy offspring; and so, they want to mate with human women. Having already used Mogera to show that conflict is useless, the Mysterians appear to have the upper hand. However, forces from East and West unite, and Earth is poised to take on the Mysterian menace. 

The Mysterians features Akihiko Hirata in a role similar to his turn in Godzilla (1954). Hirata plays the enigmatic Shiraishi, a scientist who discovered the home planet of the Mysterians, Mysteroid. Shiraishi disappears before the Mysterians emerge, and we later discover that he has joined them. Seduced by their scientific achievements, Shiraishi admires the Mysterians; he believes that they simply wish to stop mankind from destroying itself, ignorant to their real plans for conquest. His assertion of science above all else prevents him from considering the ethical horrors that come with the Mysterians’ terms.  

Shiraishi is the personification of director Honda’s concerns over the misuse of science. “At that time I feared the danger of science, that whoever controlled it could take over the entire Earth”, Honda observed…

There’s also something else that makes The Mysterians all the more chilling today. The film’s concern that we could become like the Mysterians may have already come to pass – though not in a way that’s immediately apparent. The Mysterians have gone through an unimaginable horror in the form of atomic annihilation; and yet, they haven’t learned from their own nightmare. Instead of renouncing war or seeking peace, the Mysterians have looked to further conquest. For them, there is no recognition of the horror of war, just the restart of its engine.  

At the film’s climax, when the Earth has successfully fought back the invaders, we see scattered Mysterian bodies in their decimated dome. Many of their helmets are cracked and split, revealing their faces; they look human, with very little to distinguish them from us except their wounds and radiation scars. One looks at their damaged faces and sees a miserable, endless cycle…

The Mysterians is also striking in its depiction of a united Earth, with both Russia and America working side by side. The nations of the world join to fend off the new danger, with earthbound conflicts rendered banal in the face of collective oblivion… Director Ishiro Honda’s [see here] concern was in presenting a united planet – a recurring tenet of his genre work. Of The Mysterians, Honda said, “I would like to wipe away the [Cold War-era] notion of East versus West and convey a simple, universal aspiration for peace, the coming together of all humankind as one to create a peaceful society.” As noted by his biographers (Steve Ryfle and Ed Godziszewski), the visual composition of scenes involving international meetings shows a symmetry that affirms Honda’s egalitarian view; no one country is seen above or below another…

From Christopher Stewardson (@CF_Stewardson), an appreciation of a classic that’s all-too-timely again: “Thoughts on Film: The Mysterians.”

[TotH to our buddies at Boing Boing]

* “Dr. Tanjiro Adachi,” The Mysterians

###

As we think globally, we might recall that it was on this date in 2018 that Sir David Attenborough (a naturalist and producer/host of the BBC’s epic Life on Our Planet) spoke at the UN’s climate summit in Poland. Sir David warned that climate change is humanity’s greatest threat in thousands of years, and that it could lead to the collapse of civilizations and the extinction of “much of the natural world.”

source

Written by (Roughly) Daily

December 3, 2020 at 1:01 am

“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…

 

In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead

###

As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped them enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses

source

 

 
