(Roughly) Daily

Posts Tagged ‘computing’

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say”*…

There’s a depressing sort of symmetry in the fact that our modern paradigms of privacy were developed in response to the proliferation of photographs and their exploitation by tabloids. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon. This principle undergirds the notice-and-consent model of data management, which you might also know as the Pavlovian response to click “I agree” on any popup and login screen with little regard for the forty pages of legalese you might be agreeing to.

The thing is, the right to be left alone makes perfect sense when you’re managing information relationships between individuals, where there are generally pretty clear social norms around what constitutes a boundary violation. Reasonable people can and do disagree as to the level of privacy they expect, but if I invite you into my home and you snoop through my bedside table and read my diary, there isn’t much ambiguity about that being an invasion.

But in the age of ✨ networked computing ✨, this individual model of privacy just doesn’t scale anymore. There are too many exponentially intersecting relationships for any of us to keep in our head. It’s no longer just about what we tell a friend or the tax collector or even a journalist. It’s the digital footprint that we often unknowingly leave in our wake every time we interact with something online, and how all of those websites and apps and their shadowy partners talk to each other behind our backs. It’s the cameras in malls tracking our location and sometimes emotions, and it’s the license plate readers compiling a log of our movements.

At a time when governments and companies are increasingly investing in surveillance mechanisms under the guise of security and transparency, that scale is only going to keep growing. Our individual comfort about whether we are left alone is no longer the only, or even the most salient part of the story, and we need to think about privacy as a public good and a collective value.

I like thinking about privacy as being collective, because it feels like a more true reflection of the fact that our lives are made up of relationships, and information about our lives is social and contextual by nature. The fact that I have a sister also indicates that my sister has at least one sibling: me. If I took a DNA test through 23andme I’m not just disclosing information about me but also about everyone that I’m related to, none of whom are able to give consent. The privacy implications for familial DNA are pretty broad: this information might be used to sell or withhold products and services, expose family secrets, or implicate a future as-yet-unborn relative in a crime. I could email 23andme and ask them to delete my records, and they might eventually comply in a month or three. But my present and future relatives wouldn’t be able to do that, or even know that their privacy had been compromised at all.

Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us. I might think nothing of posting a photo of going out with my friends and mentioning the name of the bar, but I’ve just exposed our physical location to the internet. If one of my friends has had to deal with a stalker in their past, I could’ve put their physical safety at risk. Even if I’m careful to make the post friends-only, the people I trust are not the same as the people my friends trust. In an individual model of privacy, we are only as private as our least private friend.

Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me.

Data collection isn’t always bad, but it is always risky. Sometimes that’s due to shoddy design and programming or lazy security practices. But even the best engineers often fail to build risk-free systems, by the very nature of systems.

Systems are easier to attack than they are to defend. If you want to defend a system, you have to make sure every part of it is perfectly implemented to guard against any possible vulnerabilities. Oftentimes, trying to defend a system means adding additional components, which just ends up creating more potential weak points. Whereas if you want to attack, all you have to do is find the one weakness that the systems designer missed. (Or, to paraphrase the IRA, you only have to be lucky once.)

This is true of all systems, digital or analog, but the thing that makes computer systems particularly vulnerable is that the same weaknesses can be deployed across millions of devices, in our phones and laptops and watches and toasters and refrigerators and doorbells. When a vulnerability is discovered in one system, an entire class of devices around the world is instantly a potential target, but we still have to go fix them one by one.

This is how the Equifax data leak happened. Equifax used a piece of open source software that had a security flaw in it, the people who work on that software found it and fixed it, and instead of diligently updating their systems Equifax hit the snooze button for four months and let hackers steal hundreds of millions of customer records. And while Equifax is definitely guilty of aforementioned lazy security practices, this incident also illustrates how fragile computer systems are. From the moment this bug was discovered, every server in the world that ran that software was at risk.

What’s worse, in many cases people weren’t even aware that their data was stored with Equifax. If you’re an adult who has had a job or a phone bill or interacted with a bank in the last seven years, your identifying information is collected by Equifax whether you like it or not. The only way to opt out would have been to be among the small percentage of overwhelmingly young, poor, and racialized people who have no credit histories, which significantly limits the scope of their ability to participate in the economy. How do you notice-and-consent your way out of that?

There unfortunately isn’t one weird trick to save democracy, but that doesn’t mean there aren’t lessons we can learn from history to figure out how to protect privacy as a public good. The scale and ubiquity of computers may be unprecedented, but so is the scale of our collective knowledge…

Read the full piece (and you should) for Jenny Zhang‘s (@phirephoenix) compelling case that we should treat– and protect– privacy as a public good, and explanation of how we might do that: “Left alone, together.” TotH to Sentiers.

[image above: source]

* Edward Snowden

###

As we think about each other, we might recall that it was on this date in 1939 that the first government appropriation was made to support the construction of the Harvard Mark I computer.

Designer Howard Aiken had enlisted IBM as a partner in 1937; company chairman Thomas Watson Sr. personally approved the project and its funding. It was completed in 1944 (and put to work on a set of war-related tasks, including calculations– overseen by John von Neumann— for the Manhattan Project).

The Mark I was the industry’s largest electromechanical calculator… and it was large: 51 feet long, 8 feet high, and 2 feet deep; it weighed about 9,445 pounds. The basic calculating units had to be synchronized and powered mechanically, so they were operated by a 50-foot (15 m) drive shaft coupled to a 5 horsepower electric motor, which served as the main power source and system clock. It could do 3 additions or subtractions in a second; a multiplication took 6 seconds; a division took 15.3 seconds; and a logarithm or a trigonometric function took over a minute… ridiculously slow by today’s standards, but a huge advance in its time.

source

“The ancient Oracle said that I was the wisest of all the Greeks. It is because I alone, of all the Greeks, know that I know nothing.”*…

The site of the oracle at Dodona

Your correspondent will be off-line for the next 10 days or so; regular service will resume on or around April 26th. In the meantime, a meeting of the (very) old and the (very) new…

The Virtual Reality Oracle (VRO) is a first-person virtual reality experience of oracular divination at the ancient Greek site of Dodona circa 450 BCE. Immerse yourself in the lives of ordinary people and community leaders alike as they travel to Dodona to consult the gods. Inspired by the questions they posed on themes as wide-ranging as wellbeing, work, and theft, perhaps you in turn will ask your question of Zeus?…

Homer mentioned Dodona; now you can be there, then. An immersive experience of the ancient Greek gods: “Virtual Reality Oracle.”

* Socrates

###

As we look for answers, we might recall that it was on this date in 1977 that both the Apple II and Commodore PET 2001 personal computers were introduced at the first annual West Coast Computer Faire.

Ironically, Commodore had previously rejected purchasing the Apple II from Steve Jobs and Steve Wozniak, deciding to build its own computers instead. Both computers used the same processor, the MOS 6502, but the companies had two different design strategies, and it showed on this day. Apple wanted to build computers with more features at a higher price point; Commodore wanted to sell less feature-filled computers at a lower price point. The Apple II had color, graphics, and sound and sold for $1,298. The Commodore PET had only a monochrome display and was priced at $795.

Note: it was very difficult to find a picture with both an original Apple II (not IIe) and Commodore PET 2001. I could only find this picture, which also includes the TRS-80, another PC introduced later in 1977.

source
The photo mentioned above: The Apple II is back left; the PET, back right

source

Written by (Roughly) Daily

April 16, 2021 at 1:01 am

“The tribalizing power of the new electronic media, the way in which they return to us to the unified fields of the old oral cultures, to tribal cohesion and pre-individualist patterns of thought, is little understood”*…

Nokia was dominant in mobile phone sales from 1998 to around 2010. Nokia’s slogan: Connecting people.

It was amazing to connect with people in the late 90s/early 2000s. I don’t think we were lonely exactly. But maybe meeting people was somewhere between an opportunity, something novel, and, yes, a need – suddenly it was possible to find the right person, or the right community.

So, the zeitgeist of the early 2000s.

I ran across a previous zeitgeist in an article about Choose Your Own Adventure books. They appeared and became massively popular at the same time as text adventure computer games, but neither inspired the invention of the other. How? The real answer may lie far deeper in the cultural subconscious … in the zeitgeist of the 1980s.

1980s: you.

2000s: connection.

2020s: ?

Zeitgeists don’t lead and zeitgeists don’t follow.

I think when we spot some kind of macro trend in establishment consumer ads, it’s never going to be about presenting people with something entirely new. To resonate, it has to be familiar – the trajectory that the consumer is already on – but it also has to scratch an itch. The brand wants to be a helpful fellow traveller, if you like.

I wonder what the zeitgeist of the 2020s will be, or is already maybe. What deep human need will be simultaneously a comfort and an aspiration? There should be hints of it in popular culture already. (If I knew how to put my finger on it, I’d be an ad planner.)

If I had to guess then it would be something about belonging.

There was a hint of this in Reddit’s 5-second Super Bowl commercial, which went hard on one of their communities, r/WallStreetBets, ganging up to bring down hedge funds. Then we’ve got a couple of generations now who grew up with the idea of fandoms, and of course conspiracy theories like QAnon too. If you squint, you can kind of see this in the way Tesla operates: it’s a consumer brand but it’s also a passionate, combative cause.

Belonging to a tribe is about identity and strength, it’s solace and empowerment all at once. And also knowledge, certainty, and trust in an era of complexity, disinfo, and hidden agendas.

Given that backdrop, it’s maybe unsurprising that the trend in software is towards Discord servers and other virtual private neighbourhoods. But how else will this appear? And is it just the beginnings of something else, something bigger?

“1980s (you), 2000s (connection). What’s the 2020s zeitgeist?” From Matt Webb (@genmon)

* Marshall McLuhan

###

As we double down on diversity, we might send well-connected birthday greetings to Joseph Carl Robnett Licklider; he was born on this date in 1915. Better known as “J.C.R.” or “Lick,” he was a prominent figure in the development of computing and computer science. Considered the “Johnny Appleseed” of computing, he planted many of the seeds of the digital age– especially via his idea of a universal computer network for easily transferring and retrieving information, which his successors developed into the internet.

Robert Taylor, founder of Xerox PARC‘s Computer Science Laboratory and Digital Equipment Corporation‘s Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”

source

Written by (Roughly) Daily

March 11, 2021 at 1:01 am

“As for memes, the word ‘meme’ is a cliche, which is to say it’s already a meme”*…

(Roughly) Daily began nearly two decades ago as a (roughly daily) email to friends. One of the earliest “editions” featured a then-current video (and the myriad reactions to and appropriations of it)…

As the Internet began crystallizing into its modern form—one that now arguably buttresses society as we know it—its anthropology of common language and references matured at a strange rate. But between the simple initialisms that emerged by the ’90s (ROFL!) and the modern world’s ecosystem of easily shared multimedia, a patchwork connection of users and sites had to figure out how to establish a base of shared references.

In some ways, the Internet as we know it really began… 20 years ago [this week], when a three-word phrase blew up: “All Your Base.”

On that day, a robo-voiced music video went live at Newgrounds.com, one of the Internet’s earliest and longest-lasting dumping grounds of Flash multimedia content, and went on to become one of the most beloved Internet videos of the 21st century. Though Flash support has since been scrapped across the entire Web-browsing ecosystem, Newgrounds continues to host the original video in a safe Flash emulator, if you’d like to see it as originally built instead of flipping through dozens of YouTube rips.

In an online world where users were previously drawn to the likes of the Hamster Dance, exactly how the heck did this absurdity become one of the Internet’s first bona fide memes?

One possible reason is that the “All Your Base Are Belong To Us” video appealed to the early Internet’s savviest users, since it was sourced from an unpopular ’90s video game. Zero Wing launched on the Sega Genesis in 1992… Across the earliest post-BBS Internet, underappreciated 8-bit and 16-bit games changed hands at a crazy rate thanks to small file sizes and 56K modems—and if you were an early Internet user, you were likely a target audience for activities like emulating a Sega Genesis on a Pentium II-powered PC.

That was the first step to exposing the world to Zero Wing‘s inadvertently hilarious text, translated from Japanese to English by an apparent amateur. Classic Japanese games are littered with crappy translations, and even mega-successful publishers like Nintendo are guilty of letting bad phrases slip into otherwise classic games. But Zero Wing soundly trounced other examples of wacky mistranslations thanks to its dramatic opening sequence pitting the generic “CAPTAIN” against a half-robot, half-demon creature in a robe named “CATS.”

Its wackiness circulated on the early Internet as a tiny GIF, with each of its silly phrases (“How are you gentlemen!!”, “Somebody set up us the bomb”) pulling significant weight in terms of weirdly placed clauses and missing punctuation. Early Internet communities poked fun at the sequence by creating and sharing gag images that had the silly text inserted in various ways. But it wasn’t until the February 2001 video, as uploaded by a user who went by “Bad-CRC,” that the meme’s appeal began to truly explode. The video presents the original Sega Genesis graphics, dubbed over with monotone, machine-generated speech reading each phrase. “You are on your way to destruction” in this voice is delightfully silly stuff…

Newgrounds was one of many dumping grounds for Flash animations, making it easier for friends to share links not only to videos but also free online games—usually in ways that school computer labs didn’t necessarily block, which led kids to devour and share their favorites when teachers weren’t carefully watching students’ screens. And in the case of “All Your Base,” its general lack of vulgarity made it easier to reach kids without drawing parental ire. This wasn’t like the early ’90s Congressional hearings against violent and sexual video games. It was just… weird.

And, gosh, it still is. Yes, this video’s 20th anniversary will likely make you feel old as dirt [indeed it does], but that doesn’t mean the video itself aged badly. There’s still something timeless about both the wackiness and innocence of so many early-Internet pioneers sending up a badly translated game. And in an age where widely disseminated memes so often descend into cruelty or shock value, it’s nice to look back at an age when memes were merely quite stupid.

Back in the day, memes didn’t benefit from centralized services like YouTube and Twitter: “An anniversary for great justice: Remembering “All Your Base” 20 years later.”

See also: “All Your Base Are Belong To Us has turned 20.”

* James Gleick

###

As we watch time fly, we might recall that it was on this date in 1986 that the Soviet Union launched the base unit of the Mir Space Station into orbit. Mir was the first modular space station; it was systematically expanded from 1986 to 1996. And while it was slated to last five years, it operated for fifteen– outliving the Soviet Union– after which it was replaced by the International Space Station.

Mir seen from Space Shuttle Endeavour (February 1998)

source

(We might also note that it was on this date in 1962 that John Glenn, in Friendship 7, became the first American to orbit the earth. Yuri Gagarin had become the first person to accomplish this feat when he orbited the Earth in a Soviet Vostok spacecraft on April 12, 1961.)

“We know the past but cannot control it. We control the future but cannot know it.”*…

Readers will know of your correspondent’s fascination with– and admiration for– Claude Shannon

Within engineering and mathematics circles, Shannon is a revered figure. At 21 [in 1937], he published what’s been called the most important master’s thesis of all time, explaining how binary switches could do logic and laying the foundation for all future digital computers. At the age of 32, he published A Mathematical Theory of Communication, which Scientific American called “the Magna Carta of the information age.” Shannon’s masterwork invented the bit, or the objective measurement of information, and explained how digital codes could allow us to compress and send any message with perfect accuracy.
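Shannon’s “objective measurement of information” has a simple formula at its heart: the entropy H = −Σ p·log₂(p) over the probabilities of each symbol, measured in bits. As a minimal illustrative sketch (the function name and example strings are mine, not Shannon’s or the quoted piece’s):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin flip carries exactly 1 bit per symbol...
print(shannon_entropy("HTHTHTHT"))  # 1.0
# ...while a perfectly predictable message carries none.
print(shannon_entropy("AAAAAAAA"))  # 0.0 (well, -0.0)
```

The second result is the intuition behind compression: the less surprising a message, the fewer bits you need to send it with perfect accuracy.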

But Shannon wasn’t just a brilliant theoretical mind — he was a remarkably fun, practical, and inventive one as well. There are plenty of mathematicians and engineers who write great papers. There are fewer who, like Shannon, are also jugglers, unicyclists, gadgeteers, first-rate chess players, codebreakers, expert stock pickers, and amateur poets.

Shannon worked on the top-secret transatlantic phone line connecting FDR and Winston Churchill during World War II and co-built what was arguably the world’s first wearable computer. He learned to fly airplanes and played the jazz clarinet. He rigged up a false wall in his house that could rotate with the press of a button, and he once built a gadget whose only purpose when it was turned on was to open up, release a mechanical hand, and turn itself off. Oh, and he once had a photo spread in Vogue.

Think of him as a cross between Albert Einstein and the Dos Equis guy…

From Jimmy Soni (@jimmyasoni), co-author of A Mind At Play: How Claude Shannon Invented the Information Age: “11 Life Lessons From History’s Most Underrated Genius.”

* Claude Shannon

###

As we learn from the best, we might recall that it was on this date in 1946 that an early beneficiary of Shannon’s thinking, the ENIAC (Electronic Numerical Integrator And Computer), was first demonstrated in operation. (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943, as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and Presper Eckert of the University of Pennsylvania, where it was built. The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world. ENIAC’s basic clock speed was 100,000 cycles per second. Today’s home computers have clock speeds of 1,000,000,000 cycles per second.
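The closing comparison is simple back-of-the-envelope arithmetic, using only the two clock figures quoted above (clock rate alone, of course, is a very rough proxy for speed):

```python
# Clock-rate comparison from the figures cited in the post.
eniac_hz = 100_000           # ENIAC's basic clock, cycles per second
modern_hz = 1_000_000_000    # the ~1 GHz "home computer" figure quoted above

# A four-orders-of-magnitude gap in clock rate alone.
print(modern_hz // eniac_hz)  # 10000
```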

source
