(Roughly) Daily

Posts Tagged ‘network’

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say”*…

There’s a depressing sort of symmetry in the fact that our modern paradigms of privacy were developed in response to the proliferation of photography and its exploitation by tabloids. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon. This principle undergirds the notice-and-consent model of data management, which you might also know as the Pavlovian response to click “I agree” on any popup and login screen with little regard for the forty pages of legalese you might be agreeing to.

The thing is, the right to be left alone makes perfect sense when you’re managing information relationships between individuals, where there are generally pretty clear social norms around what constitutes a boundary violation. Reasonable people can and do disagree as to the level of privacy they expect, but if I invite you into my home and you snoop through my bedside table and read my diary, there isn’t much ambiguity about that being an invasion.

But in the age of ✨ networked computing ✨, this individual model of privacy just doesn’t scale anymore. There are too many exponentially intersecting relationships for any of us to keep in our head. It’s no longer just about what we tell a friend or the tax collector or even a journalist. It’s the digital footprint that we often unknowingly leave in our wake every time we interact with something online, and how all of those websites and apps and their shadowy partners talk to each other behind our backs. It’s the cameras in malls tracking our location and sometimes emotions, and it’s the license plate readers compiling a log of our movements.

At a time when governments and companies are increasingly investing in surveillance mechanisms under the guise of security and transparency, that scale is only going to keep growing. Our individual comfort about whether we are left alone is no longer the only, or even the most salient part of the story, and we need to think about privacy as a public good and a collective value.

I like thinking about privacy as being collective, because it feels like a more true reflection of the fact that our lives are made up of relationships, and information about our lives is social and contextual by nature. The fact that I have a sister also indicates that my sister has at least one sibling: me. If I took a DNA test through 23andMe, I’m not just disclosing information about me but also about everyone that I’m related to, none of whom are able to give consent. The privacy implications for familial DNA are pretty broad: this information might be used to sell or withhold products and services, expose family secrets, or implicate a future as-yet-unborn relative in a crime. I could email 23andMe and ask them to delete my records, and they might eventually comply in a month or three. But my present and future relatives wouldn’t be able to do that, or even know that their privacy had been compromised at all.

Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us. I might think nothing of posting a photo of going out with my friends and mentioning the name of the bar, but I’ve just exposed our physical location to the internet. If one of my friends has had to deal with a stalker in their past, I could’ve put their physical safety at risk. Even if I’m careful to make the post friends-only, the people I trust are not the same as the people my friends trust. In an individual model of privacy, we are only as private as our least private friend.

Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me.

Data collection isn’t always bad, but it is always risky. Sometimes that’s due to shoddy design and programming or lazy security practices. But even the best engineers often fail to build risk-free systems, by the very nature of systems.

Systems are easier to attack than they are to defend. If you want to defend a system, you have to make sure every part of it is perfectly implemented to guard against any possible vulnerabilities. Oftentimes, trying to defend a system means adding additional components, which just ends up creating more potential weak points. Whereas if you want to attack, all you have to do is find the one weakness that the systems designer missed. (Or, to paraphrase the IRA, you only have to be lucky once.)
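To make that asymmetry concrete, here is a rough back-of-the-envelope sketch– an illustrative aside, not part of the quoted essay: if we assume each component of a system is independently flaw-free with some fixed probability, the defender’s odds of leaving an attacker nothing to find shrink geometrically as components are added. The 99%-per-component figure below is purely an assumption for illustration.

# Illustrative sketch (not from the quoted essay): the defender must get
# every component right; the attacker needs only one miss.

def p_no_weakness(per_component_ok: float, n_components: int) -> float:
    """Probability that every one of n independent components is flaw-free."""
    return per_component_ok ** n_components

for n in (10, 100, 1000):
    # assuming an optimistic 99% chance that any single component is flaw-free
    print(f"{n:>4} components at 99% each -> "
          f"{p_no_weakness(0.99, n):.1%} chance of no weak point")

Even with that optimistic 99%, a ten-part system has no weak point about 90% of the time, a hundred-part system about 37% of the time, and a thousand-part system essentially never.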

This is true of all systems, digital or analog, but the thing that makes computer systems particularly vulnerable is that the same weaknesses can be deployed across millions of devices, in our phones and laptops and watches and toasters and refrigerators and doorbells. When a vulnerability is discovered in one system, an entire class of devices around the world is instantly a potential target, but we still have to go fix them one by one.

This is how the Equifax data leak happened. Equifax used a piece of open source software that had a security flaw in it; the people who work on that software found it and fixed it, but instead of diligently updating their systems, Equifax hit the snooze button for four months and let hackers steal hundreds of millions of customer records. And while Equifax is definitely guilty of the aforementioned lazy security practices, this incident also illustrates how fragile computer systems are. From the moment this bug was discovered, every server in the world that ran that software was at risk.

What’s worse, in many cases people weren’t even aware that their data was stored with Equifax. If you’re an adult who has had a job or a phone bill or interacted with a bank in the last seven years, your identifying information is collected by Equifax whether you like it or not. The only way to opt out would have been to be among the small percentage of overwhelmingly young, poor, and racialized people who have no credit histories, which significantly limits the scope of their ability to participate in the economy. How do you notice-and-consent your way out of that?

There unfortunately isn’t one weird trick to save democracy, but that doesn’t mean there aren’t lessons we can learn from history to figure out how to protect privacy as a public good. The scale and ubiquity of computers may be unprecedented, but so is the scale of our collective knowledge…

Read the full piece (and you should) for Jenny Zhang’s (@phirephoenix) compelling case that we should treat– and protect– privacy as a public good, and her explanation of how we might do that: “Left alone, together.” TotH to Sentiers.

[image above: source]

* Edward Snowden

###

As we think about each other, we might recall that it was on this date in 1939 that the first government appropriation was made to support the construction of the Harvard Mark I computer.

Designer Howard Aiken had enlisted IBM as a partner in 1937; company chairman Thomas Watson Sr. personally approved the project and its funding. It was completed in 1944 (and put to work on a set of war-related tasks, including calculations– overseen by John von Neumann– for the Manhattan Project).

The Mark I was the industry’s largest electromechanical calculator… and it was large: 51 feet long, 8 feet high, and 2 feet deep; it weighed about 9,445 pounds. The basic calculating units had to be synchronized and powered mechanically, so they were operated by a 50-foot (15 m) drive shaft coupled to a 5 horsepower electric motor, which served as the main power source and system clock. It could do 3 additions or subtractions in a second; a multiplication took 6 seconds; a division took 15.3 seconds; and a logarithm or a trigonometric function took over a minute… ridiculously slow by today’s standards, but a huge advance in its time.
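For a sense of scale, here is a quick back-of-the-envelope calculation using the figures above; the ~60-second stand-in for “over a minute” and the modern-CPU rate are rough assumptions of ours, not numbers from the original description.

SECONDS_PER_DAY = 86_400

# Mark I timings quoted above; 60 s stands in for "over a minute".
mark_i_seconds_per_op = {
    "addition or subtraction": 1 / 3,   # 3 per second
    "multiplication": 6.0,
    "division": 15.3,
    "logarithm or trig (assumed)": 60.0,
}

for op, secs in mark_i_seconds_per_op.items():
    print(f"{op:>28}: about {SECONDS_PER_DAY / secs:>9,.0f} per day on the Mark I")

# Assumed ballpark for a modern CPU: ~1e9 simple additions per second.
modern_adds_per_second = 1e9
print(f"\nAt that rate, a modern CPU is roughly "
      f"{modern_adds_per_second / 3:,.0f} times faster at addition alone.")

Run the numbers and the Mark I tops out around a quarter-million additions in a full day of operation– work a modern processor dispatches in well under a millisecond.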

source

“The tribalizing power of the new electronic media, the way in which they return us to the unified fields of the old oral cultures, to tribal cohesion and pre-individualist patterns of thought, is little understood”*…

Nokia was dominant in mobile phone sales from 1998 to around 2010. Nokia’s slogan: Connecting people.

It was amazing to connect with people in the late 90s/early 2000s. I don’t think we were lonely exactly. But maybe meeting people was somewhere between an opportunity, something novel, and, yes, a need – suddenly it was possible to find the right person, or the right community.

So, the zeitgeist of the early 2000s.

I ran across a previous zeitgeist in an article about Choose Your Own Adventure books. They appeared and became massively popular at the same time as text adventure computer games, but neither inspired the invention of the other. How? The real answer may lie far deeper in the cultural subconscious … in the zeitgeist of the 1980s.

1980s: you.

2000s: connection.

2020s: ?

Zeitgeists don’t lead and zeitgeists don’t follow.

I think when we spot some kind of macro trend in establishment consumer ads, it’s never going to be about presenting people with something entirely new. To resonate, it has to be familiar – the trajectory that the consumer is already on – but it also has to scratch an itch. The brand wants to be a helpful fellow traveller, if you like.

I wonder what the zeitgeist of the 2020s will be, or is already maybe. What deep human need will be simultaneously a comfort and an aspiration? There should be hints of it in popular culture already. (If I knew how to put my finger on it, I’d be an ad planner.)

If I had to guess then it would be something about belonging.

There was a hint of this in Reddit’s 5-second Super Bowl commercial, which went hard on one of their communities, r/WallStreetBets, ganging up to bring down hedge funds. Then we’ve got a couple of generations now who grew up with the idea of fandoms, and of course conspiracy theories like QAnon too. If you squint, you can kind of see this in the way Tesla operates: it’s a consumer brand but it’s also a passionate, combative cause.

Belonging to a tribe is about identity and strength, it’s solace and empowerment all at once. And also knowledge, certainty, and trust in an era of complexity, disinfo, and hidden agendas.

Given that backdrop, it’s maybe unsurprising that the trend in software is towards Discord servers and other virtual private neighbourhoods. But how else will this appear? And is it just the beginnings of something else, something bigger?

1980s (you), 2000s (connection). What’s the 2020s zeitgeist?” From Matt Webb (@genmon)

* Marshall McLuhan

###

As we double down on diversity, we might send well-connected birthday greetings to Joseph Carl Robnett Licklider; he was born on this date in 1915. Better known as “J.C.R.” or “Lick,” he was a prominent figure in the development of computing and computer science. Considered the “Johnny Appleseed” of computing, he planted many of the seeds of computing in the digital age– especially via his idea of a universal computer network to easily transfer and retrieve information, which his successors developed into the internet.

Robert Taylor, founder of Xerox PARC‘s Computer Science Laboratory and Digital Equipment Corporation‘s Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”

source

Written by (Roughly) Daily

March 11, 2021 at 1:01 am

“PLATO was my Alexandria. It was my library, it was the place where I could attach myself to anything.”*…

 

[image: a PLATO terminal]

 

Once upon a time [in the 60s and 70s], there was a computer network with thousands of users across the world. It featured chat rooms, message boards, multiplayer games, a blog-like newspaper, and accredited distance learning, all piped to flat-panel plasma screens that were also touchscreens. And it wasn’t the internet.

It was PLATO (Programmed Logic for Automatic Teaching Operations), and its original purpose was to harness the power of the still-obscure world of computing as a teaching tool. Developing PLATO required simultaneous quantum leaps in technological sophistication, and it worked—college and high-school students quickly learned how to use it, and also pushed it to do new things.

Despite decades of use at major universities, it all but vanished in the 1980s and from popular memory in the years that followed, a victim of the microcomputer revolution. At its peak, PLATO was surprisingly similar to the modern internet, and it left its DNA in technology we still use today…

The story of the ur-internet: “PLATO.”

* novelist Richard Powers (who was a coder before he turned to literary fiction)

###

As we log on, we might send super birthday greetings to Seymour Roger Cray; he was born on this date in 1925.  An electrical engineer and computer architect, he designed a series of computers that were the fastest in the world for decades, and founded Cray Research which built many of these machines– effectively creating the “supercomputer” industry and earning the honorific “father of supercomputing.”


With a Cray-1

source

 

Written by (Roughly) Daily

September 28, 2019 at 1:01 am

“The study of man is the study of his extensions”*…

 


The magic lantern was invented in the 1600s, probably by Christiaan Huygens, a Dutch scientist. It was the earliest form of slide projector and has a long and fascinating history. The first magic lanterns were illuminated by candles, but as technology evolved they were lit by increasingly powerful means.

The name “magic lantern” comes from the experience of the early audiences who saw devils and angels mysteriously appear on the wall, as if by magic. Even in the earliest period, performances contained images that moved—created with moving pieces of glass.

By the 18th century the lantern was a common form of entertainment and education in Europe. The earliest known “lanthorn show” in the U. S. was in Salem, Massachusetts, on December 3, 1743, “for the Entertainment of the Curious.” But the source of light for lanterns in this period—usually oil lamps—was still weak, and as a consequence the audiences were small.

In the mid 19th century, two new forms of illumination were developed which led to an explosion of lantern use. “Limelight” was created by heating a piece of limestone in burning gas until it became incandescent. It was dangerous, but produced a light that was strong enough to project an image before thousands of people, leading to large shows by professional showmen…

All about the entertainment sensation of its time at the web site of The Magic Lantern Society.  [TotH to friend and colleague RW]

And for a peek at the transition from the static images of the magic lantern to film-as-we-know-it, see “Putting Magic in the Magic Lantern.”

[image above: source]

* Edward T. Hall, Beyond Culture

###

As we watch with wonder, we might recall that it was on this date in 1926 that the NBC Radio Network, the first radio network in the U.S., was launched.  Carl Schlegel of the Metropolitan Opera opened the four-hour inaugural broadcast, which also featured Will Rogers and Mary Garden; it included a remote link from KYW in Chicago and was carried by twenty-two eastern and midwestern stations, located as far west as WDAF in Kansas City, Missouri.

NBC had been formed from assets already held by its parent, the Radio Corporation of America (RCA), and other assets acquired from AT&T (which had been, up to that point, a pioneer in radio technology).  Crucially, as part of the reassignment permissions granted by the government, NBC was allowed to sell advertising.

NBC’s network grew quickly; two months later, on January 1, 1927, it was split into the Red and Blue networks.  And it quickly attracted competition:  the Columbia Broadcasting System (CBS) in 1927 and the Mutual Broadcasting System in 1934.  In 1942 the government required NBC to divest one of its networks; it sold off NBC Blue, which became The American Broadcasting Company (ABC).

source

 

 

Written by (Roughly) Daily

November 15, 2018 at 1:01 am

“Every day sees humanity more victorious in the struggle with space and time”*…

 

Contact: A hundred years before iconic figures like Bill Gates and Steve Jobs permeated our lives, 60 years before Marshall McLuhan proclaimed media to be “the extensions of man,” an Irish-Italian inventor laid the foundation of the communication explosion of the 21st century. Guglielmo Marconi was arguably the first truly global figure in modern communication. Not only was he the first to communicate globally, he was the first to think globally about communication. Marconi may not have been the greatest inventor of his time, but more than anyone else, he brought about a fundamental shift in the way we communicate.

Today’s globally networked media and communication system has its origins in the 19th century, when, for the first time, messages were sent electronically across great distances. The telegraph, the telephone, and radio were the obvious precursors of the Internet, iPods, and mobile phones. What made the link from then to now was the development of wireless communication. Marconi was the first to develop and perfect a practical system for wireless, using the recently-discovered “air waves” that make up the electromagnetic spectrum…

An excerpt from Marconi: The Man Who Networked the World by Marc Raboy. Oxford University Press.  Via “How Marconi Gave Us the Wireless World.”

* Guglielmo Marconi

###

As we tweak the dial, we might recall that, thanks to a handwritten note by illustrator Heinrich Cremer, we know that the final binding of the Gutenberg Bible took place on this date in 1456.

 source

 

Written by (Roughly) Daily

August 24, 2016 at 1:01 am
