(Roughly) Daily

Posts Tagged ‘communications’

“Your job as a scientist is to figure out how you’re fooling yourself”*…

Larger version here

And like scientists, so all of us…

Science has shown that we tend to make all sorts of mental mistakes, called “cognitive biases,” that can affect both our thinking and actions. These biases can lead us to extrapolate information from the wrong sources, to seek only confirmation of existing beliefs, or to fail to remember events the way they actually happened.

To be sure, this is all part of being human—but such cognitive biases can also have a profound effect on our endeavors, investments, and life in general.

Humans have a tendency to think in particular ways that can lead to systematic deviations from rational judgment.

These tendencies usually arise from:

• Information processing shortcuts

• The limited processing ability of the brain

• Emotional and moral motivations

• Distortions in storing and retrieving memories

• Social influence

Cognitive biases have been studied for decades by academics in the fields of cognitive science, social psychology, and behavioral economics, but they are especially relevant in today’s information-packed world. They influence the way we think and act, and such irrational mental shortcuts can lead to all kinds of problems in entrepreneurship, investing, or management.

Here are five examples of how these types of biases can affect people in the business world:

1. Familiarity Bias: An investor puts her money in “what she knows,” rather than seeking the obvious benefits of portfolio diversification. Just because a certain type of industry or security is familiar doesn’t make it the logical selection.

2. Self-Attribution Bias: An entrepreneur overly attributes his company’s success to himself, rather than other factors (team, luck, industry trends). When things go bad, he blames these external factors for derailing his progress.

3. Anchoring Bias: An employee in a salary negotiation is too dependent on the first number mentioned in the negotiations, rather than rationally examining a range of options.

4. Survivorship Bias: Entrepreneurship looks easy, because there are so many successful entrepreneurs out there. However, this is a cognitive bias: the successful entrepreneurs are the ones still around, while the millions who failed went and did other things.

5. Gambler’s Fallacy: A venture capitalist sees a portfolio company rise and rise in value after its IPO, far beyond what he initially thought possible. Instead of holding on to a winner and rationally evaluating the possibility that appreciation could still continue, he dumps the stock to lock in the existing gains.

An aid to thinking about thinking: “Every Single Cognitive Bias in One Infographic.” From DesignHacks.co via Visual Capitalist.

And for a fascinating look at cognitive bias’s equally dangerous cousin, innumeracy, see here.

* Saul Perlmutter, astrophysicist, Nobel laureate

###

As we cogitate, we might recall that it was on this date in 1859 that “The Carrington Event” began. Lasting two days, it was the largest solar storm on record: a powerful solar flare, accompanied by a coronal mass ejection (CME), that affected many of the (relatively few) telegraph lines and electrical systems then on Earth.

A solar storm of this magnitude occurring today would cause widespread electrical disruptions, blackouts, and damage from extended outages of the electrical grid. The solar storm of 2012 was of similar magnitude, but it passed through Earth’s orbit without striking the planet, crossing a point that Earth itself had occupied just nine days earlier. See here for more detail on what such a storm might entail.

Sunspots of 1 September 1859, as sketched by R.C. Carrington. A and B mark the initial positions of an intensely bright event, which moved over the course of five minutes to C and D before disappearing.

“Foresight begins when we accept that we are now creating a civilization of risk”*…

There have been a handful of folks– Vernor Vinge, Don Michael, Sherry Turkle, to name a few– who were, decades ago, exceptionally foresightful about the technologically-mediated present in which we live. Philip Agre belongs in their number…

In 1994 — before most Americans had an email address or Internet access or even a personal computer — Philip Agre foresaw that computers would one day facilitate the mass collection of data on everything in society.

That process would change and simplify human behavior, wrote the then-UCLA humanities professor. And because that data would be collected not by a single, powerful “big brother” government but by lots of entities for lots of different purposes, he predicted that people would willingly part with massive amounts of information about their most personal fears and desires.

“Genuinely worrisome developments can seem ‘not so bad’ simply for lacking the overt horrors of Orwell’s dystopia,” wrote Agre, who has a doctorate in computer science from the Massachusetts Institute of Technology, in an academic paper.

Nearly 30 years later, Agre’s paper seems eerily prescient, a startling vision of a future that has come to pass in the form of a data industrial complex that knows no borders and few laws. Data collected by disparate ad networks and mobile apps for myriad purposes is being used to sway elections or, in at least one case, to out a gay priest. But Agre didn’t stop there. He foresaw the authoritarian misuse of facial recognition technology, he predicted our inability to resist well-crafted disinformation and he foretold that artificial intelligence would be put to dark uses if not subjected to moral and philosophical inquiry.

Then, no one listened. Now, many of Agre’s former colleagues and friends say they’ve been thinking about him more in recent years, and rereading his work, as pitfalls of the Internet’s explosive and unchecked growth have come into relief, eroding democracy and helping to facilitate a violent uprising on the steps of the U.S. Capitol in January.

“We’re living in the aftermath of ignoring people like Phil,” said Marc Rotenberg, who edited a book with Agre in 1998 on technology and privacy, and is now founder and executive director for the Center for AI and Digital Policy…

As Reed Albergotti (@ReedAlbergotti) explains, better late than never: “He predicted the dark side of the Internet 30 years ago. Why did no one listen?”

Agre’s papers are here.

* Jacques Ellul

###

As we consider consequences, we might recall that it was on this date in 1858 that Queen Victoria sent the first official telegraph message across the Atlantic Ocean, from London to U.S. President James Buchanan in Washington, D.C.– initiating a new era in global communications.

Transmission of the message began at 10:50am and wasn’t completed until 4:30am the next day, taking nearly eighteen hours to reach Newfoundland. Ninety-nine words, containing five hundred nine letters, were transmitted at a rate of about two minutes per letter.

After White House staff had satisfied themselves that it wasn’t a hoax, the President sent a reply of 143 words in a relatively rapid ten hours. Without the cable, a dispatch in one direction alone would have taken roughly twelve days by the speediest combination of inland telegraph and fast steamer.
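For the arithmetically inclined, the reported figures hang together; here’s a minimal Python sketch of the check (the numbers are those quoted above, and the two-minutes-per-letter rate is itself an approximation):

    # Back-of-the-envelope check of the 1858 transatlantic telegraph figures
    letters = 509             # letters in Queen Victoria's 99-word message
    minutes_per_letter = 2    # the reported (approximate) transmission rate

    hours = letters * minutes_per_letter / 60
    print(f"{hours:.1f} hours")  # ~17 hours, consistent with the reported
                                 # "nearly eighteen" (10:50am to 4:30am)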


“O brave new world”*…

 


 

With the arrival of autonomous weapons systems (AWS)[1] on the 21st century battlefield, the nature of warfare is poised for dramatic change.[2] Overseen by artificial intelligence (AI), fueled by terabytes of data and operating at lightning-fast speed, AWS will be the decisive feature of future military conflicts.[3] Nonetheless, under the American way of war, AWS will operate within existing legal and policy guidelines that establish conditions and criteria for the application of force.[4] Even as the Department of Defense (DoD) places limitations on when and how AWS may take action,[5] the pace of new conflicts and adoption of AWS by peer competitors will ultimately push military leaders to empower AI-enabled weapons to make decisions with less and less human input.[6] As such, timely, accurate, and context-specific legal advice during the planning and operation of AWS missions will be essential. In the face of digital-decision-making, mere human legal advisors will be challenged to keep up!

Fortunately, at the same time that AI is changing warfare, the practice of law is undergoing a similar AI-driven transformation.[7]

From The Reporter, the journal of The Judge Advocate General’s Corps: “Autonomous Weapons Need Autonomous Lawyers.”

As I finish drafting this post [on October 5], I’ve discovered that none of the links are available any longer; the piece (and the referenced articles within it, also from The Reporter) was apparently removed from public view while I was drafting– from a Reporter web page that, obviously, opened for me earlier. You will find other references to (and excerpts from/comments on) the article here, here, and here. I’m leaving the original links in, in case they become active again…

* Shakespeare, The Tempest

###

As we wonder if this can end well, we might recall that it was on this date in 1983 that Ameritech executive Bob Barnett made a phone call from a car parked near Soldier Field in Chicago, officially launching the first cellular network in the United States.


Barnett (foreground, in the car) and his audience

 

Written by (Roughly) Daily

October 13, 2019 at 1:01 am

“We never live; we are always in the expectation of living”*…

 


 

Sure, that hold music is annoying, grating, a punishment to brain cells, especially if it loops tightly or is particularly in-your-face, but you know what’d be worse? If there was no sound at all.

That was the point that a man named Alfred Levy made when he filed a patent application in 1962 for the “Telephone hold program system,” which is the very patent that led to the creation of hold music.

A 2014 Slate piece helpfully filled in the gaps on this story: Levy, a factory employee, stumbled upon the basic idea for hold music after a freak incident involving a wire and a steel girder. Oddly enough, when the wire touched the steel, it effectively turned the building into a giant radio, leading people on hold waiting for phone calls to actually hear music on the line, rather than waiting in silence.

It might sound far-fetched, but that’s the tale, apparently. Nonetheless, his patent filing, granted in 1966, does a great job of explaining why such a tool is necessary. He noted that switchboards and telephone operators increasingly were using hold buttons, which allow time to properly route calls through a switchboard. However, little consideration was being given to the person on the other end of the line, who understandably might get frustrated, or worry that the call has dropped, if they don’t hear back after a while.

“Courteous telephone practice requires that a held caller be assured at reasonable intervals that the party to whom he wishes to speak still is busy but the pressure of her duties may prevent the operator from so advising the incoming caller so that he may be bereft of even this small consolation,” the patent filing stated. “In any event, listening to a completely unresponsive instrument is tedious and calls often are abandoned altogether or remade which leads to annoyance and a waste of time and money.”

A telephone hold system, he continues, is basically a way to pacify the person waiting for assistance, as it “assures the incoming caller that his call is being held and that he is not disconnected or forgotten.”

The timing of his invention was basically perfect, coming along right as the call center was making its first appearance…

Via @ShortFormErnie and his always-illuminating Tedium, the unusual state of hold music, which works pretty much the opposite way that every other kind of music does, for reasons both technical and psychological: “Holding Patterns.”

* Voltaire

###

As we wait, we might recall that it was on this date in 1976 that the CN Tower in Toronto opened. At 1,815.3 ft, it held the record as the world’s tallest free-standing structure for 32 years, until it was surpassed by the Burj Khalifa in 2007, and was the world’s tallest tower until the Canton Tower overtook it in 2009. In 1995, the CN Tower was declared one of the modern Seven Wonders of the World by the American Society of Civil Engineers. (It also belongs to the World Federation of Great Towers.) It serves as a communications tower, the site of numerous broadcast (and reception) antennae for TV, radio, cell phone, and microwave providers… and, of course, it is a signature icon of Toronto’s skyline.


 

Written by (Roughly) Daily

June 26, 2019 at 1:01 am

“Information is a difference that makes a difference”*…

 


 

Information was something guessed at rather than spoken of, something implied in a dozen ways before it was finally tied down. Information was a presence offstage. It was there in the studies of the physiologist Hermann von Helmholtz, who, electrifying frog muscles, first timed the speed of messages in animal nerves just as Thomson was timing the speed of messages in wires. It was there in the work of physicists like Rudolf Clausius and Ludwig Boltzmann, who were pioneering ways to quantify disorder—entropy—little suspecting that information might one day be quantified in the same way. Above all, information was in the networks that descended in part from the first attempt to bridge the Atlantic with underwater cables. In the attack on the practical engineering problems of connecting Points A and B—what is the smallest number of wires we need to string up to handle a day’s load of messages? how do we encrypt a top-secret telephone call?—the properties of information itself, in general, were gradually uncovered.

By the time of Claude Shannon’s childhood, the world’s communications networks were no longer passive wires acting as conduits for electricity, a kind of electron plumbing. They were continent-spanning machines, arguably the most complex machines in existence. Vacuum-tube amplifiers strung along the telephone lines added power to voice signals that would have otherwise attenuated and died out on their thousand-mile journeys. A year before Shannon was born, in fact, Bell and Watson inaugurated the transcontinental phone line by reenacting their first call, this time with Bell in New York and Watson in San Francisco. By the time Shannon was in elementary school, feedback systems managed the phone network’s amplifiers automatically, holding the voice signals stable and silencing the “howling” or “singing” noises that plagued early phone calls, even as the seasons turned and the weather changed around the sensitive wires that carried them. Each year that Shannon placed a call, he was less likely to speak to a human operator and more likely to have his call placed by machine, by one of the automated switchboards that Bell Labs grandly called a “mechanical brain.” In the process of assembling and refining these sprawling machines, Shannon’s generation of scientists came to understand information in much the same way that an earlier generation of scientists came to understand heat in the process of building steam engines.

It was Shannon who made the final synthesis, who defined the concept of information and effectively solved the problem of noise. It was Shannon who was credited with gathering the threads into a new science…
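The quantity Shannon ultimately defined, entropy measured in bits, is simple enough to sketch in a few lines of Python. (A minimal illustration using only the standard library; the example strings are arbitrary, not drawn from the article.)

    from collections import Counter
    from math import log2

    def shannon_entropy(message: str) -> float:
        """Average information per symbol, in bits: H = -sum(p * log2(p))."""
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # A repetitive message carries less information per symbol than a varied one:
    print(shannon_entropy("aaaaaaab"))     # ~0.54 bits per symbol
    print(shannon_entropy("information"))  # ~2.91 bits per symbol

Intuitively, the more predictable the next symbol, the less “news” it delivers– and the fewer bits needed, on average, to encode it.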

The story of Claude Shannon, his colorful life, and the birth of the Information Age: “How Information Got Re-Invented.”

* Gregory Bateson

###

As we separate the signal from the noise, we might send communicative birthday greetings to the subject of today’s main post, Claude Elwood Shannon; he was born on this date in 1916.  A mathematician, electrical engineer, and cryptographer, he is, for reasons explained in the article featured above, known as “the father of information theory.”  But he is also remembered for his contributions to digital circuit design theory and for his cryptanalysis work during World War II, both as a codebreaker and as a designer of secure communications systems.


 
