Posts Tagged ‘surveillance’
“A Room of One’s Own”*…
Interiors of real living and working spaces from an unusual perspective. It seems as if someone has lifted the ceiling and replaced it with a scanner. In fact, the photographs consist of up to a hundred individual shots taken with a remote control and then digitally assembled. One is reminded of doll’s houses and, at the same time, of surveillance…
Menno Aden’s photos evoke the spirit of our “Surveillance Capitalist” age: “Room Portraits.”
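(For the technically curious: the kind of digital assembly described above can be approximated with off-the-shelf tools. Below is a minimal, purely illustrative sketch using OpenCV’s stitching module; the file paths are hypothetical, and this generic composite pipeline is not Aden’s actual workflow.)

```python
# A minimal sketch of digitally assembling many overhead shots into one composite.
# This is a generic OpenCV stitching example, not Menno Aden's actual workflow;
# the file paths are hypothetical placeholders.
import glob
import cv2

# Load the individual remote-controlled shots (hypothetical directory).
images = [cv2.imread(path) for path in sorted(glob.glob("room_shots/*.jpg"))]
images = [img for img in images if img is not None]

# OpenCV's Stitcher finds overlapping features across the shots,
# estimates how they align, and blends them into a single image.
stitcher = cv2.Stitcher_create()
status, composite = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("room_portrait.jpg", composite)
else:
    print(f"Stitching failed with status {status}")
```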
* An extended essay by Virginia Woolf
###
As we duck and cover, we might recall that it was on this date in 1954 that Rosemary Clooney hit the top of the U.S. pop chart with “This Ole House,” which soon topped the British chart as well. Written earlier that year by Stuart Hamblen, it was a #1 hit again in the U.K. in 1981 for Shakin’ Stevens.
“I never said, ‘I want to be alone.’ I only said ‘I want to be let alone!’ There is all the difference.”*…
Someone observing her could assemble in forensic detail her social and familial connections, her struggles and interests, and her beliefs and commitments. From Amazon purchases and Kindle highlights, from purchase records linked with her loyalty cards at the drugstore and the supermarket, from Gmail metadata and chat logs, from search history and checkout records from the public library, from Netflix-streamed movies, and from activity on Facebook and Twitter, dating sites, and other social networks, a very specific and personal narrative is clear.
If the apparatus of total surveillance that we have described here were deliberate, centralized, and explicit, a Big Brother machine toggling between cameras, it would demand revolt, and we could conceive of a life outside the totalitarian microscope. But if we are nearly as observed and documented as any person in history, our situation is a prison that, although it has no walls, bars, or wardens, is difficult to escape.
Which brings us back to the problem of “opting out.” For all the dramatic language about prisons and panopticons, the sorts of data collection we describe here are, in democratic countries, still theoretically voluntary. But the costs of refusal are high and getting higher: A life lived in social isolation means living far from centers of business and commerce, without access to many forms of credit, insurance, or other significant financial instruments, not to mention the minor inconveniences and disadvantages — long waits at road toll cash lines, higher prices at grocery stores, inferior seating on airline flights.
It isn’t possible for everyone to live on principle; as a practical matter, many of us must make compromises in asymmetrical relationships, without the control or consent for which we might wish. In those situations — everyday 21st-century life — there are still ways to carve out spaces of resistance, counterargument, and autonomy.
We are surrounded by examples of obfuscation that we do not yet think of under that name. Lawyers engage in overdisclosure, sending mountains of vaguely related client documents in hopes of burying a pertinent detail. Teenagers on social media — surveilled by their parents — will conceal a meaningful communication to a friend in a throwaway line or a song title surrounded by banal chatter. Literature and history provide many instances of “collective names,” where a population took a single identifier to make attributing any action or identity to a particular person impossible, from the fictional “I am Spartacus” to the real “Poor Conrad” and “Captain Swing” in prior centuries — and “Anonymous,” of course, in ours…
There is real utility in an obfuscation approach, whether that utility lies in bolstering an existing strong privacy system, in covering up some specific action, in making things marginally harder for an adversary, or even in the “mere gesture” of registering our discontent and refusal. After all, those who know about us have power over us. They can deny us employment, deprive us of credit, restrict our movements, refuse us shelter, membership, or education, manipulate our thinking, suppress our autonomy, and limit our access to the good life…
As Finn Brunton and Helen Nissenbaum argue in their new book Obfuscation: A User’s Guide for Privacy and Protest, those who know about us have power over us; obfuscation may be our best digital weapon: “The Fantasy of Opting Out.”
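(To make the tactic concrete: here is a toy Python sketch of the kind of noise generation performed by tools like TrackMeNot, co-created by Nissenbaum, which hide a genuine search query among plausible decoys. The decoy list, timing, and print-only "submit" step are illustrative assumptions, not code from the book.)

```python
# A toy sketch of query obfuscation in the spirit of tools like TrackMeNot:
# a real search is submitted alongside a stream of plausible decoy queries,
# so an observer of the traffic cannot tell which interest is genuine.
# The decoy list, timing, and endpoint are illustrative assumptions only.
import random
import time

DECOY_QUERIES = [
    "weather tomorrow", "banana bread recipe", "local hardware store hours",
    "how to change a bike tire", "cheap flights to denver", "movie showtimes",
]

def submit(query: str) -> None:
    # Placeholder for an actual HTTP request to a search engine.
    print(f"searching: {query}")

def obfuscated_search(real_query: str, decoys_per_real: int = 5) -> None:
    """Hide one genuine query among randomly timed decoys."""
    queries = random.sample(DECOY_QUERIES, k=decoys_per_real) + [real_query]
    random.shuffle(queries)
    for q in queries:
        submit(q)
        time.sleep(random.uniform(0.5, 3.0))  # jitter so timing doesn't give the real query away

obfuscated_search("symptoms of chronic illness")
```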
* Greta Garbo
###
As we ponder privacy, we might recall that it was on this date in 1536 that William Tyndale was strangled, then burned at the stake, for heresy at Vilvoorde, near Brussels. An English scholar and leading Protestant reformer, Tyndale effectively replaced Wycliffe’s Middle English translation of the Bible with a vernacular version in what we now call Early Modern English (as used, for instance, by Shakespeare). Tyndale’s translation was the first English Bible to take advantage of the printing press, and the first of the new English Bibles of the Reformation. When it first went on sale in London, authorities gathered up all the copies they could find and burned them. But after England went Protestant, it received official approval and ultimately became the basis of the King James Version.
Ironically, Tyndale incurred Henry VIII’s wrath after the King’s “conversion” to Protestantism, by writing a pamphlet decrying Henry’s divorce from Catherine of Aragon. Tyndale had moved to continental Europe, where he continued to advocate Protestant reform, ultimately running afoul of the Holy Roman Empire, which sentenced him to death.
“Outward show is a wonderful perverter of the reason”*…
Humans have long hungered for a shorthand to help in understanding and managing other humans. From phrenology to the Myers-Briggs Test, we’ve tried dozens of shortcuts… and tended to find that at best they weren’t very helpful; at worst, they reinforced inaccurate stereotypes and so led to results that were unfair and ineffective. Still, the quest continues, these days powered by artificial intelligence. What could go wrong?…
Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.
While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.
But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.
Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.
In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.
But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…
“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”
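(For a concrete sense of what these systems do, here is a deliberately simplified Python sketch of the standard pipeline: detect a face, reduce it to features, and map those features onto a fixed set of Ekman-style labels. The training data, classifier, and label set are placeholders, not any vendor’s real system; note that the contested scientific assumption, that a facial configuration reveals an inner feeling, is baked into the labels themselves.)

```python
# A deliberately simplified sketch of a face-to-"emotion" pipeline of the kind
# described above. Everything is illustrative: the training data is random
# noise standing in for a labeled dataset, and the label set encodes the
# contested assumption that facial configurations map onto inner feelings.
from typing import Optional

import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

EKMAN_LABELS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Step 1: detect and crop a face using OpenCV's stock Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_features(image_bgr: np.ndarray) -> Optional[np.ndarray]:
    """Return crude pixel features for the first detected face, if any."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
    return crop.flatten() / 255.0

# Step 2: train a classifier on labeled face crops.
# Here the "dataset" is random noise; a real system would use labeled images.
rng = np.random.default_rng(0)
X_train = rng.random((120, 48 * 48))                    # placeholder features
y_train = rng.integers(0, len(EKMAN_LABELS), size=120)  # placeholder labels
classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Step 3: "read" an emotion from a new image -- the step critics argue cannot
# be validated, because what someone feels is not observable from the face alone.
def read_emotion(image_bgr: np.ndarray) -> str:
    features = face_features(image_bgr)
    if features is None:
        return "no face found"
    return EKMAN_LABELS[int(classifier.predict([features])[0])]
```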
###
As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.
It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web, though this first proposal was a relatively vague request to research the details and feasibility of such a system. He followed up with a more concrete proposal on November 12, 1990 that detailed the actual implementation of the World Wide Web.