(Roughly) Daily

Posts Tagged ‘Technology’

“Please cut off a nanosecond and send it over to me”*…

[image: Commodore Grace M. Hopper, USN]

 

“Amazing Grace” Hopper, seminal computer scientist and Rear Admiral in the U.S. Navy, explains a “nanosecond”…

* Grace Hopper
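For a sense of the scale she was illustrating, here is the back-of-the-envelope arithmetic behind the roughly foot-long “nanoseconds” (lengths of wire) she famously handed out in her lectures. (A minimal Python sketch; note that signals in real wire travel somewhat slower than light in a vacuum.)

```python
# Distance light travels in a vacuum in one nanosecond: Hopper's "nanosecond."
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second (exact, by definition)
NANOSECOND_S = 1e-9                   # one billionth of a second

distance_m = SPEED_OF_LIGHT_M_PER_S * NANOSECOND_S
print(f"{distance_m * 100:.1f} cm")         # ~30.0 cm
print(f"{distance_m / 0.0254:.1f} inches")  # ~11.8 inches, just under a foot
```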

###

As we celebrate clarity, we might recall that it was on this date in 1961 that Robert Noyce was issued patent number 2,981,877 for his “semiconductor device-and-lead structure,” the first patent for what would come to be known as the integrated circuit.  In fact, another engineer, Jack Kilby, had separately and essentially simultaneously developed the same technology (Kilby’s design was rooted in germanium; Noyce’s in silicon) and had filed a few months earlier than Noyce… a fact that was recognized in 2000 when Kilby was awarded the Nobel Prize in Physics– an honor in which Noyce, who had died in 1990, did not share.

Noyce (left) and Kilby (right)

 source

 

 

 

Written by LW

April 25, 2019 at 1:01 am

“Our major obligation is not to mistake slogans for solutions”*…

 


 

In 2008, the Columbia Journalism Review published an article with the headline “Overload!,” which examined news fatigue in “an age of too much information.” When “Overload!” was published, BlackBerrys still dominated the smartphone market, push notifications hadn’t yet come to the iPhone, retweets weren’t built into Twitter, and BuzzFeed News did not exist. Looking back, the idea of suffering from information overload in 2008 seems almost quaint. Now, more than a decade later, a fresh reckoning seems to be upon us. Last year, Tim Cook, the chief executive officer of Apple, unveiled a new iPhone feature, Screen Time, which allows users to track their phone activity. During an interview at a Fortune conference, Cook said that he was monitoring his own usage and had “slashed” the number of notifications he receives. “I think it has become clear to all of us that some of us are spending too much time on our devices,” Cook said.

It is worth considering how news organizations have contributed to the problems Newport and Cook describe. Media outlets have been reduced to fighting over a shrinking share of our attention online; as Facebook, Google, and other tech platforms have come to monopolize our digital lives, news organizations have had to assume a subsidiary role, relying on those sites for traffic. That dependence exerts a powerful influence on which stories are pursued, how they’re presented, and the speed and volume at which they’re turned out…

A central purpose of journalism is the creation of an informed citizenry. And yet—especially in an environment of free-floating, ambient news—it’s not entirely clear what it means to be informed: “The Urgent Quest for Slower, Better News.”

* Edward R. Murrow

###

As we break news, we might recall that it was on this date in 1704 that the first issue of The Boston News-Letter was published.  Heavily subsidized by the British government, with a limited circulation, it was the first continuously-published newspaper in North America.  The colonies’ first newspaper was (the rather more editorially-independent) Publick Occurrences Both Forreign and Domestick, which published its first and only issue on September 25, 1690.

[image: The Boston News-Letter, first issue] source

 

Written by LW

April 24, 2019 at 1:01 am

“Big Data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it”*…

 

[image: A Day in Data]

 

You’ve probably heard of kilobytes, megabytes, gigabytes, or even terabytes.

These data units are common everyday amounts that the average person may run into. Units this size may be big enough to quantify the amount of data sent in an email attachment, or the data stored on a hard drive, for example.

In the coming years, however, these common units will begin to seem more quaint – that’s because the entire digital universe is expected to reach 44 zettabytes by 2020.

If this number is correct, it will mean there are 40 times more bytes than there are stars in the observable universe…
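That comparison is easy to sanity-check. A minimal Python sketch, assuming the commonly cited rough estimate of about 10^21 stars in the observable universe (the star count is an outside estimate, not a figure from the article):

```python
# Back-of-the-envelope check of the "40 times more bytes than stars" claim.
ZETTABYTE = 10**21                       # bytes in one zettabyte (SI, decimal)
digital_universe_bytes = 44 * ZETTABYTE  # projected size of the digital universe by 2020

STARS_IN_OBSERVABLE_UNIVERSE = 10**21    # commonly cited rough estimate (assumption)

ratio = digital_universe_bytes / STARS_IN_OBSERVABLE_UNIVERSE
print(f"about {ratio:.0f}x more bytes than stars")  # ~44x, the same order as the article's "40 times"
```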

The stuff of dreams, the stuff of nightmares: “How Much Data is Generated Each Day?”

* Dan Ariely

###

As we revel in really, really big numbers, we might spare a thought for Edgar Frank “Ted” Codd; he died on this date in 2003.  A distinguished computer scientist who did important work on cellular automata, he is best remembered as the father of computer databases– as the person who laid the foundation for relational databases for storing and retrieving information in computer records.
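For a concrete (if anachronistically modern) illustration of the relational model Codd proposed, here is a minimal sketch using Python’s built-in sqlite3 module: data lives in tables of rows, and a query describes what to retrieve by joining tables on shared values. (The tables and query are illustrative examples, not anything drawn from Codd’s own work.)

```python
import sqlite3

# A tiny relational database: rows stored in tables, retrieved declaratively.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE papers  (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT, year INTEGER);
    INSERT INTO authors VALUES (1, 'E. F. Codd');
    INSERT INTO papers  VALUES (1, 1, 'A Relational Model of Data for Large Shared Data Banks', 1970);
""")

# The query states *what* we want; the join on author_id links the two tables.
rows = conn.execute("""
    SELECT a.name, p.title, p.year
    FROM authors AS a
    JOIN papers  AS p ON p.author_id = a.id
""")
for name, title, year in rows:
    print(f"{name}: {title} ({year})")
```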

[image: Edgar F. Codd] source

 

Written by LW

April 18, 2019 at 1:01 am

“Status is welcome, agreeable, pleasant, and hard to obtain in the world”*…

 


 

“It is a truth universally acknowledged, that a person in possession of little fortune, must be in want of more social capital.”

So wrote Jane Austen, or she would have, I think, if she were chronicling our current age (instead we have Taylor Lorenz, and thank goodness for that).

Let’s begin with two principles:

  • People are status-seeking monkeys*
  • People seek out the most efficient path to maximizing social capital

I begin with these two observations of human nature because few would dispute them, yet I seldom see social networks, some of the largest and fastest-growing companies in the history of the world, analyzed on the dimension of status or social capital.

It’s in part a measurement issue. Numbers lend an air of legitimacy and credibility. We have longstanding ways to denominate and measure financial capital and its flows. Entire websites, sections of newspapers, and a ton of institutions report with precision on the prices and movements of money.

We have no such methods for measuring the values and movement of social capital, at least not with anywhere near the accuracy or precision. The body of research feels both broad and yet meager. If we had better measures besides user counts, this piece and many others would be full of charts and graphs that added a sense of intellectual heft to the analysis. There would be some annual presentation called the State of Social akin to Meeker’s Internet Trends Report, or perhaps it would be a fifty page sub-section of her annual report.

Despite this, most of the social media networks we study generate much more social capital than actual financial capital, especially in their early stages; almost all such companies have internalized one of the popular truisms of Silicon Valley, that in the early days, companies should postpone revenue generation in favor of rapid network scaling. Social capital has much to say about why social networks lose heat, stall out, and sometimes disappear altogether. And, while we may not be able to quantify social capital, as highly attuned social creatures, we can feel it.

Social capital is, in many ways, a leading indicator of financial capital, and so its nature bears greater scrutiny. Not only is it good investment or business practice, but analyzing social capital dynamics can help to explain all sorts of online behavior that would otherwise seem irrational.

In the past few years, much progress has been made analyzing Software as a Service (SaaS) businesses. Not as much has been made on social networks. Analysis of social networks still strikes me as being like economic growth theory long before Paul Romer’s paper on endogenous technological change. However, we can start to demystify social networks if we also think of them as SaaS businesses, but instead of software, they provide status. This post is a deep dive into what I refer to as Status as a Service (StaaS) businesses…

Eugene Wei (of Amazon, Hulu, and Flipboard, among other tech successes) on the implications of our hunger for recognition and rank: “Status as a Service (StaaS).”

Pair with: “Understanding Tradeoffs (pt. 2): Breaking the Altruism vs. Capitalism Dichotomy.”

[Image above: source]

* Buddha [Ittha Sutta, AN 5.43]

###

As we contemplate our craving, we might recall that it was on this date in 1845 that a method for manufacturing elastic (rubber) bands was patented in Britain by Stephen Perry and Thomas Barnabas Daft of London (G.B. No. 13880/1845).

In the early 19th century, sailors had brought home items made by Central and South American natives from the sap of rubber trees, including footwear, garments and bottles.  Around 1820, a Londoner named Thomas Hancock sliced up one of the bottles to create garters and waistbands.  By 1843, he had secured patent rights from Charles Macintosh for vulcanized India rubber.  (Vulcanization made rubber stable and able to retain its elasticity.)  Stephen Perry, owner of Messrs Perry and Co., patented the use of India rubber as springs in bands, belts, etc., and (with Daft) also the manufacture of elastic bands by slicing suitable sizes of vulcanized India rubber tube.  The bands were lightly scented to mask the smell of the treated rubber.

 source

 

Written by LW

March 17, 2019 at 1:01 am

“Outward show is a wonderful perverter of the reason”*…

 

[image: facial analysis]

Humans have long hungered for a shorthand to help in understanding and managing other humans.  From phrenology to the Myers-Briggs Test, we’ve tried dozens of shortcuts… and tended to find that at best they weren’t actually very helpful; at worst, they reinforced inaccurate stereotypes and so led to results that were unfair and ineffective.  Still, the quest continues– these days powered by artificial intelligence.  What could go wrong?…

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.

While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.

But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.

Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.

In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.

But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…

“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”

* Marcus Aurelius, Meditations

###

As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.

It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web, though this first proposal was essentially a request to research the details and feasibility of such a system.  He later submitted a proposal, on November 12, 1990, that much more directly detailed the actual implementation of the World Wide Web.

[image] source

 

“To paraphrase several sages: Nobody can think and hit someone at the same time”*…

 


“Stranded on the Island of Circe” by Paul Reid

 

What was the driving force that made us human, akin to but separate from other apes and our evolutionary cousins such as the Neanderthals? In The Goodness Paradox, the anthropologist Richard Wrangham approvingly quotes Frederick the Great in pointing to “the wild beast” within each man: our nature, he argues, is rooted in an animal violence that morphed over time to become uniquely human. When male human ancestors began to plot together to execute aggressive men in their communities, indeed to carry out such killings through what Wrangham calls “coalitionary proactive aggression”, they were launched towards full humanity…

At some point after the evolutionary split from the non-human ape lineage – probably around 300,000 years ago, Wrangham thinks – our male ancestors began to do what the chimpanzees could not: plot together to execute aggressive males in their own social groups. How do we know this? Because we see evidence of “the domestication syndrome” under way in our ancestors at this time, indicating that they were becoming less in thrall to reactive aggression…

During human evolution, of course, no other more dominant species controlled the process: instead, we domesticated ourselves by eliminating the most aggressive males in our social groups. Our bodies did signal what was happening. Around 315,000 years ago, for example, “the first glimmerings of the smaller face and reduced brow ridge [compared to earlier human ancestors] that signal the evolution of Homo sapiens” began to show up. Sex differences in the skeleton soon began to diminish. Our species was set apart from all other human-like ones, including the Neanderthals, who did not self-domesticate…

How the human species domesticated itself: “Wild beast within.”

* Susan Sontag, Regarding the Pain of Others

###

As we take it easy, we might recall that it was on this date in 1836 that Samuel Colt and a group of financial backers chartered the Patent Arms Manufacturing Company of Paterson, New Jersey, a company formed to produce what became the first commercially-successful revolvers.  The revolver was pioneered by other inventors; Colt’s great contribution was the use of interchangeable parts.  He envisioned that all the parts of every Colt gun would be interchangeable and made by machine, to be assembled later by hand– that’s to say, his goal, later realized, was an assembly line.

[image: Samuel Colt, engraving by John Chester Buttre, c. 1855] source

 

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

 

[image: robot writer]

 

Recently, OpenAI announced its latest breakthrough, GPT-2, a language model that can write essays to a prompt, answer questions, and summarize longer works… sufficiently well that OpenAI has said it’s too dangerous to release the full model (lest it result in “deepfake news” or other misleading mischief).
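OpenAI did release a small version of the model alongside the announcement, and it can be run locally. For readers curious what “write essays to a prompt” looks like in practice, here is a minimal sketch using the third-party Hugging Face transformers library rather than OpenAI’s own code; the prompt and sampling parameters are arbitrary choices for illustration:

```python
# Prompt-conditioned text generation with the small, publicly released GPT-2
# checkpoint, via the Hugging Face `transformers` library (not OpenAI's release code).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # the small released checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The most surprising thing about machine-written prose is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# The model repeatedly predicts a next token; sampling keeps the output varied.
output_ids = model.generate(
    input_ids,
    max_length=60,                        # prompt plus continuation, in tokens
    do_sample=True,                       # sample instead of always taking the top token
    top_k=40,                             # restrict sampling to the 40 likeliest tokens
    temperature=0.8,                      # soften the distribution slightly
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```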

Scott Alexander contemplates the results.  His conclusion:

a brain running at 5% capacity is about as good as the best AI that the brightest geniuses working in the best-equipped laboratories in the greatest country in the world are able to produce in 2019. But:

We believe this project is the first step in the direction of developing large NLP systems without task-specific training data. That is, we are developing a machine language system in the generative style with no explicit rules for producing text. We hope for future collaborations between computer scientists, linguists, and machine learning researchers.

A boring sentiment from an interesting source: the AI wrote that when asked to describe itself. We live in interesting times.

His complete post, eminently worthy of reading in full: “Do Neural Nets Dream of Electric Hobbits?”

[image above, and another account of OpenAI’s creation: “OpenAI says its new robo-writer is too dangerous for public release”]

* Eliezer Yudkowsky

###

As we take the Turing Test, we might send elegantly-designed birthday greetings to Steve Jobs; he was born on this date in 1955.  While he is surely well-known to every reader here, let us note for the record that he was instrumental in developing the Macintosh, the computer that took Apple to unprecedented levels of success.  After leaving the company he started with Steve Wozniak, Jobs continued his personal computer development at his new company, NeXT Inc.  In 1997, Jobs returned to Apple to lead the company into a new era based on NeXT technologies and consumer electronics.  Some of Jobs’ achievements in this new era include the iMac, the iPhone, the iTunes music store, the iPod, and the iPad.  Under Jobs’ leadership Apple was at one time the world’s most valuable company.  (And, of course, he bought Pixar from George Lucas, and oversaw both its rise to animation dominance and its sale to Disney– as a product of which Jobs became Disney’s largest single shareholder.)

[image: Steve Jobs] source

 

Written by LW

February 24, 2019 at 1:01 am
