(Roughly) Daily

Posts Tagged ‘Science Fiction’

“The details are not the details. They make the design.”*…

It’s 2020 and our systems are failing us. We are increasingly reliant on technology that automates bias. We are celebrating “essential workers” while they are underpaid and their work is precarious. We are protesting in the streets because of policing systems that put black and brown people at risk every day. We use apps for travel, shopping, and transportation that productize exploitative labor practices. The list goes on and on.

How did we get here? These systems didn’t just emerge of their own accord. They were crafted by people who made hundreds of decisions, big and small, that led to the outcomes we see now. In other words, these systems and all of their component parts were designed. And for the most part, they were designed with processes intended to create positive user experiences. So what went wrong? Might the way we approach design be contributing to the problems we now experience?

It’s unlikely that the techniques that got us into this situation will be the ones to get us out of it. In this essay, we’re going to take a deeper look at dominant design practices — specifically user-centered design — to identify where our frameworks might be failing us and how we can expand our design practices to close those gaps.

Any framework is a lens through which you see things. A lens allows you to see some things quite well, but almost always at the expense of obscuring others. Prior to the development of user-centered design, technological experiences were primarily designed through the lens of business needs. The needs of the user were only considered insofar as they furthered or hindered those goals, but it was the bottom line that was firmly the focal point of that approach.

User-centered design (UCD) was developed in reaction to those blind spots. It advocated for a design practice that instead focused on the person using the technology, and was intended to create experiences based on an understanding of their needs and goals. As designers, we’ve spent much of the last 25 years convincing our peers of the virtues of putting user needs at the center of our design process.

This practice has produced some amazing products, services, and technical innovations. And for designers who entered the industry in the past decade or so, UCD has become a default mindset and approach. By empathizing with users and designing with their needs and wants in mind, we have strived to create products that are more helpful, more intuitive, and less stressful. Certainly many of the digital tools & platforms we use today would not have been possible without the contributions of designers and the user-centered approach.

However, like any lens, UCD also has its own blind spots, and those have played a role in leading us to our current state…

As the world grows increasingly complex, the limitations of user-centered design are becoming painfully obvious. Alexis Lloyd (@alexislloyd) on what’s gone wrong and how we can fix it: “Camera Obscura: Beyond the lens of user-centered design.”

[Via Patrick Tanguay’s ever-illuminating Sentiers]

For an amusingly– and amazingly– apposite example of inclusively-empathetic design, see “‘If the aliens lay eggs, how does that affect architecture?’: sci-fi writers on how they build their worlds.”

* Charles Eames

###

As we ideate inclusively, we might recall that on this date in 1993 (following President George H.W. Bush’s executive order in 1992) Martin Luther King Jr. Day was officially proclaimed a holiday in all 50 states for the first time. Bush’s order was not fully implemented until 2000, when Utah, the last state to fully recognize the holiday, formally observed it. (Utah had previously celebrated the holiday at the same time but under the name Human Rights Day.)


 source

“The nations of the world must now stay united in the struggle against unknown forces instead of fighting each other”*…

Toho’s [see here] The Mysterians (1957) is a mammoth sci-fi spectacle, featuring giant lasers, flying saucers, underground domes, alien invaders, and robot monsters. Lying beneath its visual prowess is a set of questions, themes, and ideas that elevate The Mysterians as one of the decade’s most fascinating films. It asserts a warning for humanity: don’t misuse science. For 1957, in the midst of a spiraling nuclear arms race between the United States and the Soviet Union, the film is chilling; but when examined through the lens of 2020, The Mysterians is arguably even more frightening today.

In the film, a series of earthquakes and forest fires precedes the appearance of a giant robot, Mogera. The mechanical monster wreaks havoc before it is blown up by the self-defence forces. The next day, a gigantic dome emerges from the ground, and we are introduced to the robot’s creators: the Mysterians. They beckon key scientists to meet them in their base, where they explain themselves as a race ravaged by atomic war. The Mysterians want three kilometres of land on which to live, but they also have an unpleasant stipulation. The Mysterians’ bodies are so damaged by radiation that they can no longer birth healthy offspring; and so, they want to mate with human women. Having already used Mogera to show that conflict is useless, the Mysterians appear to have the upper hand. However, forces from East and West unite, and Earth is poised to take on the Mysterian menace. 

The Mysterians features Akihiko Hirata in a role similar to his turn in Godzilla (1954). Hirata plays the enigmatic Shiraishi, a scientist who discovered the home planet of the Mysterians, Mysteroid. Shiraishi disappears before the Mysterians emerge, and we later discover that he has joined them. Seduced by their scientific achievements, Shiraishi admires the Mysterians; he believes that they simply wish to stop mankind from destroying itself, ignorant of their real plans for conquest. His assertion of science above all else prevents him from considering the ethical horrors that come with the Mysterians’ terms.

Shiraishi is the personification of director Honda’s concerns over the misuse of science. “At that time I feared the danger of science, that whoever controlled it could take over the entire Earth”, Honda observed…

There’s also something else that makes The Mysterians all the more chilling today. The film’s concern that we could become like the Mysterians may have already come to pass – though not in a way that’s immediately apparent. The Mysterians have gone through an unimaginable horror in the form of atomic annihilation; and yet, they haven’t learned from their own nightmare. Instead of renouncing war or seeking peace, the Mysterians have looked to further conquest. For them, there is no recognition of the horror of war, just the restart of its engine.  

At the film’s climax, when the Earth has successfully fought back the invaders, we see scattered Mysterian bodies in their decimated dome. Many of their helmets are cracked and split, revealing their faces; they look human, with very little to distinguish them from us except their wounds and radiation scars. One looks at their damaged faces and sees a miserable, endless cycle…

The Mysterians is also striking in its depiction of a united Earth, with both Russia and America working side by side. The nations of the world join to fend off the new danger, with earthbound conflicts rendered banal in the face of collective oblivion… Director Ishiro Honda’s [see here] concern was in presenting a united planet – a recurring tenet of his genre work. Of The Mysterians, Honda said, “I would like to wipe away the [Cold War-era] notion of East versus West and convey a simple, universal aspiration for peace, the coming together of all humankind as one to create a peaceful society.” As noted by his biographers (Steve Ryfle and Ed Godziszewski), the visual composition of scenes involving international meetings shows a symmetry that affirms Honda’s egalitarian view; no one country is seen above or below another…

From Christopher Stewardson (@CF_Stewardson), an appreciation of a classic that’s all-too-timely again: “Thoughts on Film: The Mysterians.”

[TotH to our buddies at Boing Boing]

* “Dr. Tanjiro Adachi,” The Mysterians

###

As we think globally, we might recall that it was on this date in 2018 that Sir David Attenborough (a naturalist and producer/host of the BBC’s epic Life on Our Planet) spoke at the UN’s climate summit in Poland. Sir David warned that climate change is humanity’s greatest threat in thousands of years, and that it could lead to the collapse of civilizations and the extinction of “much of the natural world.”

source

“Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe”*…

 


 

In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks. The components are mined from the Earth at great cost, and they eventually return to the Earth, however poisoned. This rock-and-metal paradigm has mostly served us well. The miniaturization of metallic components onto wafers of silicon — an empirical trend we call Moore’s Law — has defined the last half-century of life on Earth, giving us wristwatch computers, pocket-sized satellites and enough raw computational power to model the climate, discover unknown molecules, and emulate human learning.
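(An aside, not from the excerpt above: Moore’s original observation is usually stated in terms of transistor counts, with the number of components that can economically fit on a chip doubling roughly every two years. As a back-of-the-envelope rule,

N(t) ≈ N₀ · 2^(t/2)

where N₀ is the count in a baseline year and t is years elapsed; shrinking linear feature sizes by about 30% halves the area each transistor occupies, which is where each doubling comes from.)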

But there are limits to what a rock can do. Computer scientists have been predicting the end of Moore’s Law for decades. The cost of fabricating next-generation chips is growing more prohibitive the closer we draw to the physical limits of miniaturization. And there are only so many rocks left. Demand for the high-purity silica sand used to manufacture silicon chips is so high that we’re facing a global, and irreversible, sand shortage; and the supply chain for commonly-used minerals, like tin, tungsten, tantalum, and gold, fuels bloody conflicts all over the world. If we expect 21st century computers to process the ever-growing amounts of data our culture produces — and we expect them to do so sustainably — we will need to reimagine how computers are built. We may even need to reimagine what a computer is to begin with.

It’s tempting to believe that computing paradigms are set in stone, so to speak. But there are already alternatives on the horizon. Quantum computing, for one, would shift us from a realm of binary ones and zeroes to one of qubits, making computers drastically faster than we can currently imagine, and the impossible — like unbreakable cryptography — newly possible. Still further off are computer architectures rebuilt around a novel electronic component called a memristor. Speculatively proposed by the physicist Leon Chua in 1971, first proven to exist in 2008, a memristor is a resistor with memory, which makes it capable of retaining data without power. A computer built around memristors could turn off and on like a light switch. It wouldn’t require the conductive layer of silicon necessary for traditional resistors. This would open computing to new substrates — the possibility, even, of integrating computers into atomically thin nano-materials. But these are architectural changes, not material ones.
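(Another aside, a sketch of Chua’s 1971 definition rather than anything spelled out in the excerpt: the memristor is the circuit element that ties charge q to flux linkage φ, so its effective resistance depends on how much charge has already flowed through it:

M(q) = dφ/dq,   v(t) = M(q(t)) · i(t),   q(t) = ∫ i(τ) dτ

Because the state variable q persists when the current stops, the device “remembers” its last resistance with the power off, which is what makes memristive memory non-volatile.)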

For material changes, we must look farther afield, to an organism that occurs naturally only in the most fleeting of places. We need to glimpse into the loamy rot of a felled tree in the woods of the Pacific Northwest, or examine the glistening walls of a damp cave. That’s where we may just find the answer to computing’s intractable rock problem: down there, among the slime molds…

It’s time to reimagine what a computer could be: “Beyond Smart Rocks.”

(TotH to Patrick Tanguay.)

* “Moore’s Law is really a thing about human activity, it’s about vision, it’s about what you’re allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe about what is possible.”  – Carver Mead

###

As we celebrate slime, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown at the time, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television… and was thus the inspiration for the name “Philco.”)

[UPDATE- With thanks to friend MK for the catch:  your correspondent was relying on an apocryphal tale in attributing the Philco brand name to Philo Farnsworth.  Farnsworth did work with the company, and helped them enter the television business.  But the Philco trademark dates back to 1919– pre-television days– as a label for what was then the Philadelphia Storage Battery Company.]

Gernsback, wearing one of his inventions, TV Glasses

source

 

 

“When I die, I’m leaving my body to science fiction”*…

 


 

If Somnium is the first science fiction book (which many people argue is true), then this is probably the first reference to the idea of zero gravity, or weightlessness.

“…for, as magnetic forces of the earth and moon both attract the body and hold it suspended, the effect is as if neither of them were attracting it…”
From Somnium (The Dream), by Johannes Kepler.
Published in 1634

Note that the word “gravity” is not used to describe the attraction between masses; Isaac Newton did not describe universal gravitation until 1687…
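(A back-of-the-envelope aside in the Newtonian terms that postdate Kepler: the suspension he imagines occurs roughly where the two attractions cancel along the Earth–Moon line. Setting the pulls equal,

G·M_E / d² = G·M_M / (D − d)²,   so   d = D / (1 + √(M_M/M_E)) ≈ 0.9·D

where D is the Earth–Moon distance and M_E and M_M are the masses of Earth and Moon; with the Moon at about 1/81 of Earth’s mass, the balance point sits about nine-tenths of the way to the Moon.)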

The first entry in Technovelgy’s (@Technovelgy) “Timeline of Science Fiction Ideas, Technology, and Inventions.”  Starting in the 17th century, it contains hundreds of reminders– most linked to info on real-life inventors and inventions that realized the dreams– that imagination is often the inspiration for invention.

* Steven Wright

###

As we ponder precursors, we might recall that on this date in 1954 Gog premiered in Los Angeles.  The third film in Ivan Tors’ “Office of Scientific Investigation” (OSI) trilogy, following The Magnetic Monster (1953) and Riders to the Stars (also 1954), it starred Richard Egan, Constance Dowling (in her final big-screen role), and Herbert Marshall in a cautionary tale of killer robots.

source

 

“The purpose of a writer is to keep civilization from destroying itself”*…

 


 

Traditional “good vs. evil” stories follow a certain pattern: the world starts out as a good place, evil intrudes, good defeats evil, and the world goes back to being a good place. These stories are all about restoring the status quo, so they are implicitly conservative. Real science fiction stories follow a different pattern: the world starts out as a familiar place, a new discovery or invention disrupts everything, and the world is forever changed. These stories show the status quo being overturned, so they are implicitly progressive. (This observation is not original to me; it’s something that scholars of science fiction have long noted.) This was in the context of a discussion about the role of dystopias in science fiction. I said that while some dystopian stories suggest that doom is unavoidable, other ones are intended as cautionary tales, which implies we can do something to avoid the undesirable outcome…

A lot of dystopian stories posit variations on a Mad Max world where marauders roam the wasteland. That’s a kind of change no one wants to see. I think those qualify as doom. What I mean by disruption is not the end of civilization, but the end of a particular way of life. Aristocrats might have thought the world was ending when feudalism was abolished during the French Revolution, but the world didn’t end; the world changed. (The critic John Clute has said that the French Revolution was one of the things that gave rise to science fiction.)…

The familiar is always comfortable, but we need to make a distinction between what is actually desirable and what is simply what we’re accustomed to; sometimes those are the same, and sometimes they are not. The people who are the happiest with the status quo are the ones who benefit most from it, which is why the wealthy are usually conservative; the existing order works to their advantage. For example, right now there’s a discussion taking place about canceling student debt, and a related discussion about why there is such a difference in the type of financial relief available to individuals as opposed to giant corporations. The people who will be happiest to return to our existing system of debt are the ones who benefit from it, and making them uncomfortable might be a good idea…

How we may never go “back to normal”—and why that might be a good thing: Halimah Marcus (@HalimahMarcus) interviews the estimable Ted Chiang.  Read it in full: “Ted Chiang Explains the Disaster Novel We All Suddenly Live In.”

* Albert Camus

###

As we put it all into perspective, we might recall that it was on this date in 1977 that Star Wars was released.  An epic space opera directed and co-written by George Lucas, it was both a box-office and critical success.  The highest-grossing film ever at the time (until the release of E.T. the Extra-Terrestrial in 1982), it is, when adjusted for inflation, the second-highest-grossing film in North America (behind Gone With The Wind).

The film won 6 Oscars for a variety of technical achievements.  As film critic Roger Ebert wrote in his book The Great Movies, “Like The Birth of a Nation and Citizen Kane, Star Wars was a technical watershed that influenced many of the movies that came after.”  It ushered in a new generation of special effects and high-energy motion pictures, and was one of the first films to link genres together to invent a new, high-concept genre for filmmakers to build upon.  And, with Steven Spielberg’s Jaws, it shifted the film industry’s focus away from the personal filmmaking of the 1970s and toward fast-paced, big-budget blockbusters for younger audiences.

The film has been reissued many times and launched an industry of tie-in products, including novels, comics, video games, amusement park attractions, and merchandise including toys, games, and clothing. The film’s success led to two critically and commercially successful sequels, The Empire Strikes Back and Return of the Jedi, and later to a prequel trilogy, a sequel trilogy, two anthology films and various spin-off TV series.

source

 

 

Written by LW

May 25, 2020 at 1:01 am
