(Roughly) Daily

Posts Tagged ‘computers’

“A better world won’t come about simply because we use data; data has its dark underside.”*…

 

Data

 

Data isn’t the new oil, it’s the new CO2. It’s a common trope in the data/tech field to say that “data is the new oil”. The basic idea being – it’s a new resource that is being extracted, it is valuable, and it is a raw product that fuels other industries. But it also implies that data is inherently valuable in and of itself, and that “my data” is valuable – a resource that I really should tap into.

In reality, we are more impacted by other people’s data (with whom we are grouped) than we are by data about us. As I have written in the MIT Technology Review – “even if you deny consent to ‘your’ data being used, an organisation can use data about other people to make statistical extrapolations that affect you.” We are bound by other people’s consent. Our own consent (or lack thereof) is becoming increasingly irrelevant. We won’t solve the societal problems pervasive data surveillance is causing by rushing through online consent forms. If you see data as CO2, it becomes clearer that its impacts are societal, not solely individual. My neighbour’s car emissions, or the emissions from a factory on a different continent, impact me more than my own emissions or lack thereof. This isn’t to wave away individual responsibility or individual harm. It’s adding a new lens that we too often miss entirely.

We should not endlessly be defending against arguments along the lines that “people choose to willingly give up their freedom in exchange for free stuff online”. The argument is flawed for two reasons. First, the reason that is usually given – people have no choice but to consent in order to access the service, so consent is manufactured. We are not exercising choice in providing data but rather are resigned to the fact that we have no choice in the matter.

The second, less well known but just as powerful, argument is that we are not only bound by other people’s data; we are bound by other people’s consent. In an era of machine learning-driven group profiling, this effectively renders my denial of consent meaningless. Even if I withhold consent – say I refuse to use Facebook or Twitter or Amazon – the fact that everyone around me has joined means there are just as many data points about me to target and surveil. The issue is systemic; it is not one where a lone individual can make a choice and opt out of the system. We perpetuate this myth by talking about data as our own individual “oil”, ready to sell to the highest bidder. In reality I have little control over this supposed resource, which acts more like an atmospheric pollutant, impacting me and others in myriad indirect ways. There are more relations – direct and indirect – between data related to me, data about me, and data inferred about me via others than I can possibly imagine, let alone control with the tools at our disposal today.

Because of this, we need a social, systemic approach to deal with our data emissions – an environmental approach to data rights, as I’ve argued previously. But first let’s all admit that the line of inquiry defending pervasive surveillance in the name of “individual freedom” and individual consent gets us nowhere closer to understanding the threats we are facing.

Martin Tisné argues for an “environmental” approach to data rights: “Data isn’t the new oil, it’s the new CO2.”

Lest one think that we couldn’t/shouldn’t have seen this (and related issues like over dependence on algorithms, the digital divide, et al.) coming, see also Paul Baran‘s prescient 1968 essay, “On the Future Computer Era,” one of the last pieces he did at RAND, before co-leading the spin-off of The Institute for the Future.

* Mike Loukides, Ethics and Data Science

###

As we ponder privacy, we might recall that it was on this date in 1981 that IBM released IBM model number 5150– AKA the IBM PC– the original version and progenitor of the IBM PC compatible hardware platform. Since the machine was based on open architecture, within a short time of its introduction, third-party suppliers of peripheral devices, expansion cards, and software proliferated; the influence of the IBM PC on the personal computer market was substantial in standardizing a platform for personal computers (and creating a market for Microsoft’s operating system– first PC DOS, then Windows– on which the PC platform ran).  “IBM compatible” became an important criterion for sales growth; after the 1980s, only the Apple Macintosh family kept a significant share of the microcomputer market without compatibility with the IBM personal computer.

IBM PC source

 

Written by LW

August 12, 2019 at 1:01 am

“Big Data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it”*…

 


 

You’ve probably heard of kilobytes, megabytes, gigabytes, or even terabytes.

These data units are common everyday amounts that the average person may run into. Units this size may be big enough to quantify the amount of data sent in an email attachment, or the data stored on a hard drive, for example.

In the coming years, however, these common units will begin to seem more quaint – that’s because the entire digital universe is expected to reach 44 zettabytes by 2020.

If this number is correct, it will mean there are 40 times more bytes than there are stars in the observable universe…
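The comparison above is simple unit arithmetic; here is a back-of-the-envelope check in Python (a rough sketch – the star count is a commonly cited estimate that varies widely between sources, and 44 ZB was a projection, not a measurement):

```python
# Decimal (SI) data units, each a factor of 1000 apart
KILOBYTE, MEGABYTE, GIGABYTE, TERABYTE = 10**3, 10**6, 10**9, 10**12
ZETTABYTE = 10**21  # bytes

digital_universe = 44 * ZETTABYTE  # projected size of the digital universe, 2020
stars = 10**21                     # rough estimate of stars in the observable universe

# ratio of bytes to stars -- on the order of the "40 times more" claim
print(digital_universe // stars)   # → 44
```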

The stuff of dreams, the stuff of nightmares: “How Much Data is Generated Each Day?”

* Dan Ariely

###

As we revel in really, really big numbers, we might spare a thought for Edgar Frank “Ted” Codd; he died on this date in 2003.  A distinguished computer scientist who did important work on cellular automata, he is best remembered as the father of computer databases – as the person who laid the foundation for relational databases, for storing and retrieving information in computer records.

source

 

Written by LW

April 18, 2019 at 1:01 am

“The most fruitful and natural exercise for our minds is, in my opinion, conversation”*…

 

In this age of Alexa and Siri, it’s worth recalling the (long) history of human-machine conversation.  At MIT in 1966 an experiment with an early form of artificial intelligence anticipated just how close we might grow with our machines…

Client: Men are all alike.

ELIZA (a computer): In what way

Client: They’re always bugging us about something or other.

ELIZA: Can you think of a specific example

Client: Well, my boyfriend made me come here.

ELIZA: Your boyfriend made you come here

Client: He says I’m depressed much of the time.

ELIZA: I am sorry to hear you are depressed

Client: It’s true. I am unhappy.

ELIZA: Do you think coming here will help you not to be unhappy

Client: I need some help, that much seems certain.

– Joseph Weizenbaum, from a transcription of “Computational Linguistics.”

Named after George Bernard Shaw’s ingénue in Pygmalion, the ELIZA program operated on the Project MAC computer system at MIT and made natural language exchange possible between man and machine. ELIZA identified key words and phrases and responded with rote answers that simulated a conversation.
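That mechanism – match a keyword pattern, then echo the client’s own words back in a rote template – can be sketched in a few lines of Python. (A toy illustration only; the rules below are hypothetical stand-ins, not Weizenbaum’s actual DOCTOR script.)

```python
import re

# swap first-person words for second-person ones ("my" -> "your")
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

# (pattern, reply template) pairs, tried in order
RULES = [
    (re.compile(r"(.*) are all alike", re.I), "In what way"),
    (re.compile(r"(.*) made me (.*)", re.I), "{0} made you {1}"),
    (re.compile(r"i am (\w+)", re.I), "I am sorry to hear you are {0}"),
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def eliza(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            reply = template.format(*(reflect(g) for g in match.groups()))
            return reply[:1].upper() + reply[1:]
    return "Please go on"  # rote fallback when no keyword matches

print(eliza("Men are all alike"))               # → In what way
print(eliza("My boyfriend made me come here"))  # → Your boyfriend made you come here
```

Even this crude reflect-and-template trick produces the uncanny “listening” effect of the transcript above – which is precisely what unsettled Weizenbaum about his own creation.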

“Talking Cure,” via Lapham’s Quarterly.

* Michel de Montaigne, The Essays

###

As we lie back on the couch, we might note that it was on this date in 1928 that the Enigma Machine encoded its first message.

A simple German machine the size of a portable typewriter, ENIGMA allowed for security in communications by a process in which typed letters were replaced by a cipher text displayed on illuminated lamps. The cipher was symmetrical so entering the cipher text into another ENIGMA reproduced the original message. Security was provided by a set of rotor wheels and a series of patch cables whose arrangement was agreed upon previously.
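That symmetric property can be sketched with a toy reciprocal substitution in Python (illustrative only – this is not the real stepping-rotor machine, and the letter pairings below are made up):

```python
import string

def reciprocal_cipher(pairs):
    """Build a substitution that is its own inverse by swapping letter pairs."""
    table = {c: c for c in string.ascii_uppercase}
    for a, b in pairs:
        table[a], table[b] = b, a  # each agreed pair maps to each other
    return lambda text: "".join(table.get(c, c) for c in text.upper())

# hypothetical pairings standing in for the pre-agreed rotor/patch-cable setup
encode = reciprocal_cipher([("A", "T"), ("B", "X"), ("E", "R"), ("K", "N"), ("O", "S")])

cipher = encode("ATTACK AT DAWN")
print(cipher)          # → TAATCN TA DTWK
print(encode(cipher))  # → ATTACK AT DAWN  (the same setup recovers the message)
```

Because the table only swaps pairs, the cipher is its own inverse – which is why both ends needed to share only the day’s settings, not separate encryption and decryption keys. (The real machine also stepped its rotors with every keypress, so the substitution changed from letter to letter; that reflector-enforced symmetry nonetheless survived, and became one of the weaknesses the codebreakers exploited.)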

ENIGMA was used extensively by the German military during World War II to transmit battle plans and other secret information. Polish cryptanalysts had first broken the cipher in 1932; by December of 1941, British codebreakers at Bletchley Park were able to routinely read most ENIGMA traffic.

[source- Computer History Museum]

  source

 

Written by LW

July 15, 2016 at 1:01 am

Phreaking out…

Cover of the Spring 2012 issue of 2600

source

In preparation for “treat-ing” tonight’s parade of freaks, one might pause to pay respects to 2600: The Hacker Quarterly, an American publication that specializes in publishing technical information on a variety of subjects including telephone switching systems, Internet protocols and services, as well as general news concerning the computer underground.  The magazine’s moniker comes from the “phreaker” discovery (by John “Cap’n Crunch” Draper and friends in the 1960s) that the transmission of a 2600 hertz tone (which could be produced perfectly with a plastic toy whistle given away free with Cap’n Crunch cereal) over a long-distance trunk connection gained access to “operator mode” and allowed the user to explore aspects of the telephone system that were not otherwise accessible… like free long distance calls.  (The seed money for Apple was in part raised by the two Steves’ sale of “phreaking boxes” designed to do just this.)
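The famous tone itself is trivial to synthesize today. A minimal Python sketch that writes one second of 2600 Hz sine wave to a WAV file (purely illustrative – in-band supervisory signaling is long gone from the phone network, so the tone does nothing now):

```python
import math
import struct
import wave

RATE = 8000        # samples per second, telephone-quality audio
FREQ = 2600        # Hz, the long-distance trunk supervisory tone
AMPLITUDE = 32767  # full scale for signed 16-bit samples

# one second of sine wave, quantized to 16-bit integers
samples = [int(AMPLITUDE * math.sin(2 * math.pi * FREQ * n / RATE))
           for n in range(RATE)]

with wave.open("2600.wav", "wb") as f:
    f.setnchannels(1)   # mono
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(RATE)
    f.writeframes(b"".join(struct.pack("<h", s) for s in samples))
```

Draper’s cereal-box whistle did the same job acoustically: any source of a clean enough 2600 Hz tone fooled the trunk into thinking the line was idle.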

2600 has become a journal-of-record for “Grey Hat” hackers – tech explorers intent on pushing past the limits inherent in the design of a given technological device or application (as opposed to White Hats, who are ideologically motivated to do good, or Black Hats, who pursue selfish – often illegal – gain).  So its current editorial focus is largely on the web and its devices, increasingly on mobile implementations and applications.

But 2600 honors its roots, among other ways, by maintaining a gallery of photos of payphones around the world; for example…

Peshawar, Pakistan

Moscow, Russia (The payphones only accept one ruble coins, an obsolete denomination)

###

As we wax nostalgic, we might send illuminating birthday greetings to Narinder Singh Kapany; he was born on this date in 1926.  While growing up in Dehradun in northern India, a teacher informed him that light only traveled in a straight line.  He took this as a challenge and made the study of light his life’s work, initially at Imperial College, London.  In January 1954, Nature published his report of successfully transmitting images through fiber bundles – and Dr. Kapany became the father of fiber optics (a name he coined).  Dr. Kapany ultimately migrated to the U.S., where he continued to invent (he holds over 100 patents), taught, started successful companies, and became a philanthropist.  Fortune named him one of seven ‘Unsung Heroes’ in its “Businessmen of the Century” issue (November 22, 1999).  It was, of course, the implementation of Dr. Kapany’s work that rendered “phreaking” moot.

 source

Happy Halloween!

 from the NY Public Library’s Flickr set of Halloween cards

Written by LW

October 31, 2013 at 1:01 am

Geek Humor…

 

Readers who have wondered about the utility of Quora can discover at least one example here

  • Yo momma’s so mean, she has no standard deviation.
  • An infectious disease walks into a bar. The bartender says, “We don’t serve infectious diseases.” The infectious disease says, “Well, you’re not a very good host.”
  • A neutrino walks into a bar. The bartender says, “We don’t serve neutrinos in this bar.” The neutrino says, “Hey, I was just passing through.”
  • A wife asks her husband, a software engineer: “Could you please go shopping for me and buy one carton of milk, and if they have eggs, get 6!” A short time later the husband comes back with 6 cartons of milk. The wife asks him, “Why the hell did you buy 6 cartons of milk?” He replies, “They had eggs.”
  • The Higgs boson walks into a church. The priest says, “We don’t allow Higgs bosons in here.” The Higgs boson says, “But without me, how can you have mass?”
  • Two men walk into a bar, and the first one says, “I would like some H2O.” The second man says, “That sounds good; I will have H2O too.” Then the second man died.

Many more at “What are some funny nerd jokes?”

###

As we titter technologically, we might recall that it was on this date in 1988 that the first “computer worm” was unleashed.  The phrase was coined in John Brunner’s 1975 novel, The Shockwave Rider; but the first worm was written and released thirteen years later by Robert Tappan Morris, a Cornell University computer science graduate student.  Morris also has the distinction of being the first person tried and convicted under the 1986 federal Computer Fraud and Abuse Act.

 source

 

Written by LW

November 2, 2012 at 1:01 am
