(Roughly) Daily

Posts Tagged ‘Technology’

“Our major obligation is not to mistake slogans for solutions”*…

 


 

In 2008, the Columbia Journalism Review published an article with the headline “Overload!,” which examined news fatigue in “an age of too much information.” When “Overload!” was published, BlackBerrys still dominated the smartphone market, push notifications hadn’t yet come to the iPhone, retweets weren’t built into Twitter, and BuzzFeed News did not exist. Looking back, the idea of suffering from information overload in 2008 seems almost quaint. Now, more than a decade later, a fresh reckoning seems to be upon us. Last year, Tim Cook, the chief executive officer of Apple, unveiled a new iPhone feature, Screen Time, which allows users to track their phone activity. During an interview at a Fortune conference, Cook said that he was monitoring his own usage and had “slashed” the number of notifications he receives. “I think it has become clear to all of us that some of us are spending too much time on our devices,” Cook said.

It is worth considering how news organizations have contributed to the problems Newport and Cook describe. Media outlets have been reduced to fighting over a shrinking share of our attention online; as Facebook, Google, and other tech platforms have come to monopolize our digital lives, news organizations have had to assume a subsidiary role, relying on those sites for traffic. That dependence exerts a powerful influence on which stories are pursued, how they’re presented, and the speed and volume at which they’re turned out…

A central purpose of journalism is the creation of an informed citizenry. And yet—especially in an environment of free-floating, ambient news—it’s not entirely clear what it means to be informed: “The Urgent Quest for Slower, Better News.”

* Edward R. Murrow

###

As we break news, we might recall that it was on this date in 1704 that the first issue of The Boston News-Letter was published.  Heavily subsidized by the British government, with a limited circulation, it was the first continuously-published newspaper in North America.  The colonies’ first newspaper was the (rather more editorially-independent) Publick Occurrences Both Forreign and Domestick, which published its first and only issue on September 25, 1690.

[Image above: the first issue of The Boston News-Letter; source]

 

Written by LW

April 24, 2019 at 1:01 am

“Big Data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it”*…

 


 

You’ve probably heard of kilobytes, megabytes, gigabytes, or even terabytes.

These data units are the everyday amounts that the average person is likely to run into; units this size are big enough to quantify the data sent in an email attachment, or the data stored on a hard drive, for example.

In the coming years, however, these common units will begin to seem more quaint – that’s because the entire digital universe is expected to reach 44 zettabytes by 2020.

If this number is correct, it will mean there are 40 times more bytes than there are stars in the observable universe…
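A quick back-of-the-envelope check of that comparison, sketched in Python. The star count used below (roughly 10^21, one commonly cited estimate) is an assumption for illustration; published estimates of the number of stars in the observable universe vary by orders of magnitude, so the “40 times” figure is best read as an order-of-magnitude claim.

```python
# Back-of-the-envelope check of the "40 times more bytes than stars" claim.
# The star count is an assumed, commonly cited estimate (~10^21); published
# figures range from roughly 10^21 to 10^24.

ZETTABYTE = 10**21                        # SI definition: 1 ZB = 10^21 bytes

digital_universe_bytes = 44 * ZETTABYTE   # projected size of the digital universe by 2020
estimated_stars = 1.1e21                  # assumed number of stars in the observable universe

ratio = digital_universe_bytes / estimated_stars
print(f"{digital_universe_bytes:.2e} bytes is roughly {ratio:.0f}x the assumed star count")
# prints: 4.40e+22 bytes is roughly 40x the assumed star count
```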

The stuff of dreams, the stuff of nightmares: “How Much Data is Generated Each Day?”

* Dan Ariely

###

As we revel in really, really big numbers, we might spare a thought for Edgar Frank “Ted” Codd; he died on this date in 2003.  A distinguished computer scientist who did important work on cellular automata, he is best remembered as the father of computer databases– as the person who laid the foundation for relational databases, for storing and retrieving information in computer records.

[Image above: Edgar F. Codd; source]

 

Written by LW

April 18, 2019 at 1:01 am

“Status is welcome, agreeable, pleasant, and hard to obtain in the world”*…

 


 

“It is a truth universally acknowledged, that a person in possession of little fortune, must be in want of more social capital.”

So wrote Jane Austen, or she would have, I think, if she were chronicling our current age (instead we have Taylor Lorenz, and thank goodness for that).

Let’s begin with two principles:

  • People are status-seeking monkeys*
  • People seek out the most efficient path to maximizing social capital

I begin with these two observations of human nature because few would dispute them, yet I seldom see social networks, some of the largest and fastest-growing companies in the history of the world, analyzed on the dimension of status or social capital.

It’s in part a measurement issue. Numbers lend an air of legitimacy and credibility. We have longstanding ways to denominate and measure financial capital and its flows. Entire websites, sections of newspapers, and a ton of institutions report with precision on the prices and movements of money.

We have no such methods for measuring the values and movement of social capital, at least not with anywhere near the accuracy or precision. The body of research feels both broad and yet meager. If we had better measures besides user counts, this piece and many others would be full of charts and graphs that added a sense of intellectual heft to the analysis. There would be some annual presentation called the State of Social akin to Meeker’s Internet Trends Report, or perhaps it would be a fifty page sub-section of her annual report.

Despite this, most of the social media networks we study generate much more social capital than actual financial capital, especially in their early stages; almost all such companies have internalized one of the popular truisms of Silicon Valley, that in the early days, companies should postpone revenue generation in favor of rapid network scaling. Social capital has much to say about why social networks lose heat, stall out, and sometimes disappear altogether. And, while we may not be able to quantify social capital, as highly attuned social creatures, we can feel it.

Social capital is, in many ways, a leading indicator of financial capital, and so its nature bears greater scrutiny. Not only is it good investment or business practice, but analyzing social capital dynamics can help to explain all sorts of online behavior that would otherwise seem irrational.

In the past few years, much progress has been made analyzing Software as a Service (SaaS) businesses. Not as much has been made on social networks. Analysis of social networks still strikes me as being like economic growth theory long before Paul Romer’s paper on endogenous technological change. However, we can start to demystify social networks if we also think of them as SaaS businesses, but instead of software, they provide status. This post is a deep dive into what I refer to as Status as a Service (StaaS) businesses…

Eugene Wei (of Amazon, Hulu, and Flipboard, among other tech successes) on the implications of our hunger for recognition and rank: “Status as a Service (StaaS).”

Pair with: “Understanding Tradeoffs (pt. 2): Breaking the Altruism vs. Capitalism Dichotomy.”

[Image above: source]

* Buddha [Ittha Sutta, AN 5.43]

###

As we contemplate our craving, we might recall that it was on this date in 1845 that a method for manufacturing elastic (rubber) bands was patented in Britain by Stephen Perry and Thomas Barnabas Daft of London (G.B. No. 13880/1845).

In the early 19th century, sailors had brought home items made by Central and South American natives from the sap of rubber trees, including footwear, garments and bottles.  Around 1820, a Londoner named Thomas Hancock sliced up one of the bottles to create garters and waistbands. By 1843, he had secured patent rights from Charles Macintosh for vulcanized India rubber.  (Vulcanization made rubber stable and able to retain its elasticity.)  Stephen Perry, owner of Messrs Perry and Co., patented the use of India rubber as springs in bands, belts, etc., and (with Daft) also the manufacture of elastic bands by slicing suitable sizes of vulcanized India rubber tube.  The bands were lightly scented to mask the smell of the treated rubber.


 

Written by LW

March 17, 2019 at 1:01 am

“Outward show is a wonderful perverter of the reason”*…

 


Humans have long hungered for a short-hand to help in understanding and managing other humans.  From phrenology to the Myers-Briggs Test, we’ve tried dozens of short-cuts… and tended to find that at best they weren’t very helpful; at worst, they reinforced inaccurate stereotypes and so led to results that were unfair and ineffective.  Still, the quest continues– these days powered by artificial intelligence.  What could go wrong?…

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.

While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.

But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.

Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.

In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.

But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…

“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”

* Marcus Aurelius, Meditations

###

As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.

It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web; but his proposal was a relatively vague request to research the details and feasibility of such a system.  He later submitted a proposal, on November 12, 1990, that much more directly detailed the actual implementation of the World Wide Web.


 

“To paraphrase several sages: Nobody can think and hit someone at the same time”*…

 


“Stranded on the Island of Circe” by Paul Reid

 

What was the driving force that made us human, akin to but separate from other apes and our evolutionary cousins such as the Neanderthals? In The Goodness Paradox, the anthropologist Richard Wrangham approvingly quotes Frederick the Great in pointing to “the wild beast” within each man: our nature, he argues, is rooted in an animal violence that morphed over time to become uniquely human. When male human ancestors began to plot together to execute aggressive men in their communities, indeed to carry out such killings through what Wrangham calls “coalitionary proactive aggression”, they were launched towards full humanity…

At some point after the evolutionary split from the non-human ape lineage – probably around 300,000 years ago, Wrangham thinks – our male ancestors began to do what the chimpanzees could not: plot together to execute aggressive males in their own social groups. How do we know this? Because we see evidence of “the domestication syndrome” under way in our ancestors at this time, indicating that they were becoming less in thrall to reactive aggression…

During human evolution, of course, no other more dominant species controlled the process: instead, we domesticated ourselves by eliminating the most aggressive males in our social groups. Our bodies did signal what was happening. Around 315,000 years ago, for example, “the first glimmerings of the smaller face and reduced brow ridge [compared to earlier human ancestors] that signal the evolution of Homo sapiens” began to show up. Sex differences in the skeleton soon began to diminish. Our species was set apart from all other human-like ones, including the Neanderthals, who did not self-domesticate…

How the human species domesticated itself: “Wild beast within.”

* Susan Sontag, Regarding the Pain of Others

###

As we take it easy, we might recall that it was on this date in 1836 that Samuel Colt and a group of financial backers chartered the Patent Arms Manufacturing Company of Paterson, New Jersey, a company formed to produce what became the first commercially-successful revolvers.  The revolver was pioneered by other inventors; Colt’s great contribution was the use of interchangeable parts.  He envisioned that all the parts of every Colt gun would be interchangeable and made by machine, to be assembled later by hand– that’s to say, his goal, later realized, was an assembly line.

[Image above: engraving of Samuel Colt by John Chester Buttre, c. 1855; source]

 

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

 


 

Recently, OpenAI announced its latest breakthrough, GPT-2, a language model that can write essays to a prompt, answer questions, and summarize longer works… so successfully that OpenAI has said it’s too dangerous to release the code (lest it result in “deepfake news” or other misleading mischief).
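(For readers curious to see what “writing to a prompt” looks like in practice, here is a minimal sketch that prompts the small GPT-2 checkpoint OpenAI did release publicly, via the Hugging Face transformers library. The library, model name, prompt, and generation settings are assumptions for illustration, not anything specified in OpenAI’s announcement.)

```python
# Minimal sketch: generate a continuation of a prompt with the publicly
# released small GPT-2 model. Assumes `pip install transformers torch`;
# "gpt2" names the small checkpoint, not the withheld full-size model.
from transformers import pipeline, set_seed

set_seed(42)  # make the sampled continuation reproducible
generator = pipeline("text-generation", model="gpt2")

prompt = "The future of journalism is"
result = generator(prompt, max_length=40, num_return_sequences=1)
print(result[0]["generated_text"])
```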

Scott Alexander contemplates the results.  His conclusion:

a brain running at 5% capacity is about as good as the best AI that the brightest geniuses working in the best-equipped laboratories in the greatest country in the world are able to produce in 2019. But:

We believe this project is the first step in the direction of developing large NLP systems without task-specific training data. That is, we are developing a machine language system in the generative style with no explicit rules for producing text. We hope for future collaborations between computer scientists, linguists, and machine learning researchers.

A boring sentiment from an interesting source: the AI wrote that when asked to describe itself. We live in interesting times.

His complete post, eminently worthy of reading in full: “Do Neural Nets Dream of Electric Hobbits?”

[image above, and another account of OpenAI’s creation: “OpenAI says its new robo-writer is too dangerous for public release“]

* Eliezer Yudkowsky

###

As we take the Turing Test, we might send elegantly-designed birthday greetings to Steve Jobs; he was born on this date in 1955.  While he is surely well-known to every reader here, let us note for the record that he was instrumental in developing the Macintosh, the computer that took Apple to unprecedented levels of success.  After leaving the company he started with Steve Wozniak, Jobs continued his personal computer development at his NeXT Inc.  In 1997, Jobs returned to Apple to lead the company into a new era based on NeXT technologies and consumer electronics.  Some of Jobs’ achievements in this new era include the iMac, the iPhone, the iTunes music store, the iPod, and the iPad.  Under Jobs’ leadership Apple was at one time the world’s most valuable company. (And, of course, he bought Pixar from George Lucas, and oversaw both its rise to animation dominance and its sale to Disney– as a product of which Jobs became Disney’s largest single shareholder.)


 

Written by LW

February 24, 2019 at 1:01 am

“It is likely that libraries will carry on and survive, as long as we persist in lending words to the world that surrounds us, and storing them for future readers”*…

 


 

Many visions of the future lie buried in the past. One such future was outlined by the American librarian Charles Ammi Cutter in his essay “The Buffalo Public Library in 1983”, written a century before in 1883.

Cutter’s fantasy, at times dry and descriptive, is also wonderfully precise:

The [library], when complete, was to consist of two parts, the first a central store, 150 feet square, a compact mass of shelves and passageways, lighted from the ends, but neither from sides nor top; the second an outer rim of rooms 20 feet wide, lighted from the four streets. In front and rear the rim was to contain special libraries, reading-rooms, and work-rooms; on the sides, the art-galleries. The central portion was a gridiron of stacks, running from front to rear, each stack 2 feet wide, and separated from its neighbor by a passage of 3 feet. Horizontally, the stack was divided by floors into 8 stories, each 8 feet high, giving a little over 7 feet of shelf-room, the highest shelf being so low that no book was beyond the reach of the hand. Each reading-room, 16 feet high, corresponded to two stories of the stack, from which it was separated in winter by glass doors.

The imagined structure allows for a vast accumulation of books:

We have now room for over 500,000 volumes in connection with each of the four reading-rooms, or 4,000,000 for the whole building when completed.

If his vision for Buffalo Public Library might be considered fairly modest from a technological point of view, when casting his net a little wider to consider a future National Library, one which “can afford any luxury”, things get a little more inventive.

[T]hey have an arrangement that brings your book from the shelf to your desk. You have only to touch the keys that correspond to the letters of the book-mark, adding the number of your desk, and the book is taken off the shelf by a pair of nippers and laid in a little car, which immediately finds its way to you. The whole thing is automatic and very ingenious…

But for Buffalo, book delivery is a cheaper, simpler, and perhaps less noisy affair.

…for my part I much prefer our pages with their smart uniforms and noiseless steps. They wear slippers, the passages are all covered with a noiseless and dustless covering, they go the length of the hall in a passage-way screened off from the desk-room so that they are seen only when they leave the stack to cross the hall towards any desk. As that is only 20 feet wide, the interruption to study is nothing.

Cutter’s fantasy might appear fairly mundane, born out of the fairly (stereo)typical neuroses of a librarian: in the prevention of all noise (through the wearing of slippers), the halting of the spread of illness (through good ventilation), and the disorder of the collection (through technological innovations)…

Far from a wild utopian dream, today Cutter’s library of the future appears basic: there will be books and there will be clean air and there will be good lighting. One wonders what Cutter might make of the library today, in which the most basic dream remains perhaps the most radical: for them to remain in our lives, free and open, clean and bright.

More at the original, in Public Domain Review: “The Library of the Future: A Vision of 1983 from 1883.”  Read Cutter’s essay in the original at the Internet Archive.

Pair with “Libraries of the future are going to change in some unexpected ways,” in which IFTF Research Director (and Boing Boing co-founder) David Pescovitz describes a very different future from Cutter’s, and from which the image above was sourced.

* Alberto Manguel, The Library at Night

###

As we browse in bliss, we might recall that it was on this date in 1946 that the most famous early computer– the ENIAC (Electronic Numerical Integrator And Computer)– was dedicated.  The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943, as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and Presper Eckert of the University of Pennsylvania, where it was built.  The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world.  ENIAC’s basic clock speed was 100,000 cycles per second; today’s home computers run at several billion cycles per second.
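(To put that clock-speed comparison in rough numbers, here is a one-line Python sketch; the modern figure is an assumption, using a typical desktop base clock of about 3.5 GHz.)

```python
# Rough ratio of a modern desktop CPU clock to ENIAC's clock.
eniac_hz = 100_000           # ENIAC's basic clock speed, cycles per second
modern_hz = 3_500_000_000    # assumed modern home-computer clock, ~3.5 GHz

print(f"A modern CPU clock is roughly {modern_hz / eniac_hz:,.0f}x faster than ENIAC's.")
# prints: A modern CPU clock is roughly 35,000x faster than ENIAC's.
```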


 

Written by LW

February 16, 2019 at 1:01 am
