Posts Tagged ‘Bell Labs’
“For what are myths if not the imposing of order on phenomena that do not possess order in themselves? And all myths, however they differ from philosophical systems and scientific theories, share this with them, that they negate the principle of randomness in the world.”*…
And we humans are, as Kit Yates explains, myth-making animals…
Unfortunately, when it comes to understanding random phenomena, our intuition often lets us down. Take a look at the image below. Before you read the caption, see if you can pick out the data set generated using truly uniform random numbers for the coordinates of the dots (i.e., for each point, independent of the others, the horizontal coordinate is equally likely to fall anywhere along the horizontal axis and the vertical coordinate is equally likely to fall anywhere along the vertical axis).

The truly randomly distributed points in the figure are those in the left-most image. The middle image represents the position of ants’ nests that, although distributed with some randomness, demonstrate a tendency to avoid being too close together in order not to overexploit the same resources. The territorial Patagonian seabirds’ nesting sites, in the right-most image, exhibit an even more regular and well-spaced distribution, preferring not to be too near to their neighbors when rearing their young. The computer-generated points, distributed uniformly at random in the left-hand image, have no such qualms about their close proximity.
If you chose the wrong option, you are by no means alone. Most of us tend to think of randomness as being “well spaced.” The tight clustering of dots and the frequent wide gaps of the genuinely random distribution seem to contradict our inherent ideas of what randomness should look like…
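The clustering is easy to verify for yourself. A minimal sketch (the specific counts are illustrative, not from the excerpt): scatter points uniformly at random in the unit square and measure the closest pair. Points spread evenly on a 10×10 grid would sit 1/9 ≈ 0.11 apart; genuinely random points routinely land far closer, which is why a truly random scatter "looks" clustered to the eye.

```python
import random

random.seed(1)
points = [(random.random(), random.random()) for _ in range(100)]

def min_pairwise_distance(pts):
    """Distance between the closest pair of points."""
    best = float("inf")
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dx, dy = pts[i][0] - pts[j][0], pts[i][1] - pts[j][1]
            best = min(best, (dx * dx + dy * dy) ** 0.5)
    return best

# The closest random pair is typically far nearer than the 0.11 spacing
# a "well spaced" 10x10 grid of the same 100 points would impose.
print(min_pairwise_distance(points))
```

The ants and seabirds in the middle and right-hand images behave like the grid: mutual avoidance pushes the minimum spacing up, and our intuition mistakes that regularity for randomness.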
…
… As a case in point, after noticing a disproportionate number of Steely Dan songs playing on his iPod shuffle, journalist Steven Levy questioned Steve Jobs directly about whether “shuffle” was truly random. Jobs assured him that it was and even got an engineer on the phone to confirm it. A follow-up article Levy wrote in Newsweek garnered a huge response from readers having similar experiences, questioning, for example, how two Bob Dylan songs shuffled to play one after the other (from among the thousands of songs in their collections) could possibly be random.
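The arithmetic bears the engineer out. A hedged back-of-the-envelope sketch (the library size and artist counts here are illustrative assumptions, not figures from the article): shuffle a large library and count how often two songs by the same artist play back-to-back.

```python
import random

random.seed(0)
# An assumed library: 100 artists with 30 songs each, 3,000 songs total.
library = [artist_id for artist_id in range(100) for _ in range(30)]

def same_artist_adjacencies(songs):
    """Shuffle once and count back-to-back plays by the same artist."""
    order = songs[:]
    random.shuffle(order)
    return sum(1 for a, b in zip(order, order[1:]) if a == b)

# Each adjacent pair matches with probability 29/2999, about 1%, so one
# full true shuffle should average roughly 29 same-artist repeats.
average = sum(same_artist_adjacencies(library) for _ in range(200)) / 200
print(round(average, 1))
```

Dozens of "suspicious" repeats per shuffle is not evidence of a rigged algorithm; it is exactly what genuine randomness produces.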
We ascribe meaning too readily to the clustering that randomness produces, and, consequently, we deduce that there is some generative force behind the pattern. We are hardwired to do this. The “evolutionary” argument holds that tens of thousands of years ago, if you were out hunting or gathering in the forest and you heard a rustle in the bushes, you’d be wise to play it safe and to run away as fast as you could. Maybe it was a predator out looking for their lunch and by running away you saved your skin. Probably, it was just the wind randomly whispering in the leaves and you ended up looking a little foolish—foolish, but alive and able to pass on your paranoid pattern-spotting genes to the next generation…
…
This… is just one example of the phenomenon known in the psychology literature as pareidolia, in which an observer interprets an ambiguous auditory or visual stimulus as something they are familiar with. This phenomenon, otherwise known as “patternicity,” allows people to spot shapes in the clouds and is the reason why people think they see a man in the moon. Pareidolia is itself an example of the more general phenomenon of apophenia, in which people mistakenly perceive connections between and ascribe meaning to unrelated events or objects. Apophenia’s misconstrued connections lead us to validate incorrect hypotheses and draw illogical conclusions. Consequently, the phenomenon lies at the root of many conspiracy theories—think, for example, of extraterrestrial seekers believing that any bright light in the sky is a UFO.
Apophenia sends us looking for the cause behind the effect when, in reality, there is none at all. When we hear two songs by the same artist back-to-back, we are too quick to cry foul in the belief that we have spotted a pattern, when in fact these sorts of clusters are an inherent feature of randomness. Eventually, the dissatisfaction caused by the clustering inherent to the iPod’s genuinely random shuffle algorithm led Steve Jobs to implement the new “Smart Shuffle” feature on the iPod, which meant that the next song played couldn’t be too similar to the previous song, better conforming to our misconceived ideas of what randomness looks like. As Jobs himself quipped, “We’re making it less random to make it feel more random.”…
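The constraint Jobs described can be sketched as a rule about adjacency. A toy illustration (Apple's actual algorithm is unpublished; the rejection-sampling approach and the song data below are assumptions for demonstration):

```python
import random

def smart_shuffle(songs, artist, attempts=10_000):
    """Reshuffle until no two adjacent songs share an artist.

    Deliberately *less* random than a true shuffle, so that the
    result *feels* more random -- the trade-off Jobs described.
    """
    order = songs[:]
    for _ in range(attempts):
        random.shuffle(order)
        if all(artist(a) != artist(b) for a, b in zip(order, order[1:])):
            return order
    raise ValueError("no artist-separated ordering found")

random.seed(2)
songs = [(band, n) for band in ("Dylan", "Steely Dan", "Beatles")
         for n in range(3)]
playlist = smart_shuffle(songs, artist=lambda s: s[0])
print(playlist)
```

Every ordering this produces is one a true shuffle could also have produced; it simply refuses the clustered ones that trigger our pattern-spotting instincts.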
“Why Randomness Doesn’t Feel Random,” an excerpt from How to Expect the Unexpected: The Science of Making Predictions—and the Art of Knowing When Not To, by @Kit_Yates_Maths in @behscientist.
* Stanislaw Lem
###
As we ponder purported patterns, we might send carefully discerned birthday greetings to a man who did in fact find a pattern (or at least a meaning) in what might have seemed random and meaningless: Robert Woodrow Wilson; he was born on this date in 1936. An astronomer, he detected– with Bell Labs colleague Arno Penzias– cosmic microwave background radiation: “relic radiation”– that’s to say, the “sound”– of the Big Bang… familiar to those of us old enough to remember watching an old-fashioned television after the test pattern was gone (when there was no broadcast signal received): the “fuzz” we saw and the static-y sounds we heard were that “relic radiation” being picked up.
Their 1964 discovery earned them the 1978 Nobel Prize in Physics.

“On the one hand the computer makes it possible in principle to live in a world of plenty for everyone, on the other hand we are well on our way to using it to create a world of suffering and chaos. Paradoxical, no?”*…
Joseph Weizenbaum, a distinguished professor at MIT, was one of the fathers of artificial intelligence and computing as we know it; he was also one of its earliest critics– one whose concerns remain all too current. After a review of his warnings, Librarian Shipwreck shares a still-relevant set of questions Weizenbaum proposed…
At the end of his essay “Once more—A Computer Revolution,” which appeared in the Bulletin of the Atomic Scientists in 1978, Weizenbaum concluded with a set of five questions. As he put it, these were the sorts of questions that “are almost never asked” when it comes to this or that new computer-related development. These questions did not lend themselves to simple yes or no answers, but instead called for serious debate and introspection. Thus, in the spirit of that article, let us conclude this piece not with definitive answers, but with more questions for all of us to contemplate. Questions that were “almost never asked” in 1978, and which are still “almost never asked” in 2023. They are as follows:
• Who is the beneficiary of our much-advertised technological progress and who are its victims?
• What limits ought we, the people generally and scientists and engineers particularly, to impose on the application of computation to human affairs?
• What is the impact of the computer, not only on the economies of the world or on the war potential of nations, etc…but on the self-image of human beings and on human dignity?
• What irreversible forces is our worship of high technology, symbolized most starkly by the computer, bringing into play?
• Will our children be able to live with the world we are here and now constructing?
As Weizenbaum put it, “much depends on answers to these questions.”
Much still depends on answers to these questions.
Eminently worth reading in full: “‘Computers enable fantasies’ – on the continued relevance of Weizenbaum’s warnings,” from @libshipwreck.
See also: “An island of reason in the cyberstream – on the life and thought of Joseph Weizenbaum.”
* Joseph Weizenbaum (1983)
###
As we stay grounded, we might spare a thought for George Stibitz; he died on this date in 1995. A Bell Labs researcher, he was known for his work in the 1930s and 1940s on the realization of Boolean logic digital circuits using electromechanical relays as the switching element– work for which he is internationally recognized as one of the fathers of the modern digital computer.
In 1937, Stibitz, a scientist at Bell Laboratories, built a digital machine based on relays, flashlight bulbs, and metal strips cut from tin cans. He called it the “Model K” because most of it was constructed on his kitchen table. It worked on the principle that if two relays were activated, they caused a third relay to become active, where this third relay represented the sum of the operation. Then, in 1940, he gave a demonstration of the first remote operation of a computer.
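In modern terms, the circuit described above behaves like a one-bit binary adder. A hedged sketch in software (this renders the logic, not Stibitz’s actual relay wiring): the relay that closes only when both inputs are active is, in today’s vocabulary, one output bit of a half adder.

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """One-bit binary addition, the Boolean logic relays can implement."""
    sum_bit = a ^ b   # XOR: exactly one input relay is energized
    carry = a & b     # AND: the "third" relay closes when both are
    return sum_bit, carry

# Truth table: 1 + 1 = binary 10 (sum bit 0, carry bit 1).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Chaining such stages, carry into the next sum, is the same principle that scaled from a kitchen-table gadget to the relay computers of the 1940s.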
“The ‘paradox’ is only a conflict between reality and your feeling of what reality ‘ought to be’”*…

Elegant experiments with entangled light have laid bare a profound mystery at the heart of reality. Daniel Garisto explains the importance of the work done by this year’s Nobel laureates in Physics…
One of the more unsettling discoveries in the past half century is that the universe is not locally real. “Real,” meaning that objects have definite properties independent of observation—an apple can be red even when no one is looking; “local” means objects can only be influenced by their surroundings, and that any influence cannot travel faster than light. Investigations at the frontiers of quantum physics have found that these things cannot both be true. Instead, the evidence shows objects are not influenced solely by their surroundings and they may also lack definite properties prior to measurement. As Albert Einstein famously bemoaned to a friend, “Do you really believe the moon is not there when you are not looking at it?”
This is, of course, deeply contrary to our everyday experiences. To paraphrase Douglas Adams, the demise of local realism has made a lot of people very angry and been widely regarded as a bad move.
Blame for this achievement has now been laid squarely on the shoulders of three physicists: John Clauser, Alain Aspect and Anton Zeilinger. They equally split the 2022 Nobel Prize in Physics “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” (“Bell inequalities” refers to the pioneering work of the Northern Irish physicist John Stewart Bell, who laid the foundations for this year’s Physics Nobel in the early 1960s.) Colleagues agreed that the trio had it coming, deserving this reckoning for overthrowing reality as we know it. “It is fantastic news. It was long overdue,” says Sandu Popescu, a quantum physicist at the University of Bristol. “Without any doubt, the prize is well-deserved.”
“The experiments beginning with the earliest one of Clauser and continuing along, show that this stuff isn’t just philosophical, it’s real—and like other real things, potentially useful,” says Charles Bennett, an eminent quantum researcher at IBM…
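The “Bell inequalities” at issue can be checked numerically. A minimal sketch (the textbook CHSH setup, not the laureates’ actual apparatus): for spin-singlet pairs, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between detectors set at angles a and b; any local-realistic theory bounds the CHSH combination |S| by 2, while the quantum prediction reaches 2√2.

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-singlet pair
    # measured along directions at angles a and b.
    return -math.cos(a - b)

# Standard CHSH measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2), clearly above the local-realist bound of 2
```

The experiments of Clauser, Aspect, and Zeilinger measured exactly this kind of combination on entangled photons and found the quantum value, ruling out local realism.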
Quantum foundations’ journey from fringe to favor was a long one. From about 1940 until as late as 1990, the topic was often treated as philosophy at best and crackpottery at worst. Many scientific journals refused to publish papers in quantum foundations, and academic positions indulging such investigations were nearly impossible to come by…
Today, quantum information science is among the most vibrant and impactful subfields in all of physics. It links Einstein’s general theory of relativity with quantum mechanics via the still-mysterious behavior of black holes. It dictates the design and function of quantum sensors, which are increasingly being used to study everything from earthquakes to dark matter. And it clarifies the often-confusing nature of quantum entanglement, a phenomenon that is pivotal to modern materials science and that lies at the heart of quantum computing…
Eminently worth reading in full: “The Universe Is Not Locally Real, and the Physics Nobel Prize Winners Proved It,” from @dangaristo in @sciam.
Apposite: entangled particles and wormholes could be manifestations of the same phenomenon, and resolve paradoxes like information escaping a black hole: “Black Holes May Hide a Mind-Bending Secret About Our Universe.”
###
As we rethink reality, we might spare a thought for Walter Brattain; he died on this date in 1987. A physicist (at Bell Labs at the time), he worked with John Bardeen and William Shockley to invent the point-contact transistor in 1947, the birth of the semiconductor era– work for which the trio shared the Nobel Prize in Physics in 1956.
At college, Brattain said, he majored in physics and math because they were the only subjects he was good at. He became a solid physicist with a good understanding of theory, but his strength was in physically constructing experiments. Working with the ideas of Shockley and Bardeen, Brattain’s hands built the first transistor. Shortly, the transistor replaced the bulkier vacuum tube for many uses and was the forerunner of microminiature electronic parts.
As semiconductor technology has advanced, it has begun to incorporate quantum effects.
“Progress means getting nearer to the place you want to be. And if you have taken a wrong turn, then to go forward does not get you any nearer.”*…
Earlier (Roughly) Daily posts have looked at “Progress Studies” and at its relationship to the Rationalism community. Garrison Lovely takes a deeper look at this growing and influential intellectual movement that aims to understand why human progress happens – and how to speed it up…
For most of history, the world improved at a sluggish pace, if at all. Civilisations rose and fell. Fortunes were amassed and squandered. Almost every person in the world lived in what we would now call extreme poverty. For thousands of years, global wealth – at least our best approximations of it – barely budged.
But beginning around 150-200 years ago, everything changed. The world economy suddenly began to grow exponentially. Global life expectancy climbed from less than 30 years to more than 70 years. Literacy, extreme poverty, infant mortality, and even height improved in a similarly dramatic fashion. The story may not be universally positive, nor have the benefits been equally distributed, but by many measures, economic growth and advances in science and technology have changed the way of life for billions of people.
What explains this sudden explosion in relative wealth and technological power? What happens if it slows down, or stagnates? And if so, can we do something about it? These are key questions of “progress studies”, a nascent self-styled academic field and intellectual movement, which aims to dissect the causes of human progress in order to better advance it.
Founded by an influential economist and a billionaire entrepreneur, this community tends to define progress in terms of scientific or technological advancement, and economic growth – and therefore their ideas and beliefs are not without their critics. So, what does the progress studies movement believe, and what do they want to see happen in the future?
Find out at: “Do we need a better understanding of ‘progress’?,” from @GarrisonLovely at @BBC_Future.
Then judge for yourself: was Adorno right? “It would be advisable to think of progress in the crudest, most basic terms: that no one should go hungry anymore, that there should be no more torture, no more Auschwitz. Only then will the idea of progress be free from lies.” Or can–should– we be more purposively, systemically ambitious?
* C. S. Lewis
###
As we get better at getting better, we might recall that it was on this date in 1922 that the United States paid tribute to a man instrumental in the progress that Progress Studies is anxious to sustain, Alexander Graham Bell…
There were more than 14 million telephones in the United States by the time Alexander Graham Bell died. For one minute on August 4, 1922, they were all silent.
The reason: Bell’s funeral. The American inventor was the first to patent telephone technology in the United States and founded the Bell Telephone System in 1877. Though Bell wasn’t the only person to invent “the transmission of speech by electrical wires,” writes Randy Alfred for Wired, achieving patent primacy in the United States allowed him to spend his life inventing. Even though the telephone changed the world, Bell didn’t stop there.
Bell died on August 2, 1922, just a few days after his 75th birthday. “As a mark of respect every telephone exchange in the United States and Canada closed for a minute when his funeral began around 6:30 p.m. Eastern Standard Time,” Alfred writes.
On the day of the funeral, The New York Times reported that Bell was also honored by advocates for deaf people. “Entirely apart from the monumental achievement of Professor Bell as the inventor of the telephone, his conspicuous work in [sic] behalf of the deaf of this country would alone entitle him to everlasting fame,” said Felix H. Levey, president of the Institution for the Improved Instruction of Deaf Mutes.
In fact, Bell spent much of his income from the telephone on helping deaf people. In 1880, three years after founding the Bell Telephone System, Bell founded the Volta Laboratory. The laboratory, originally called Volta Associates, capitalized on Bell’s work and the work of other sound pioneers. It made money by patenting new innovations for the gramophone and other recorded sound technologies. In 1887, Bell took his share of the money from the sale of gramophone patents and founded the Volta Bureau “as an instrument for the increase and diffusion of knowledge relating to the Deaf,” writes the National Park Service. Bell and the Volta Bureau continued to work for deaf rights throughout his life.
Volta Laboratory eventually became Bell Laboratories, which was home to many of the twentieth century’s communication innovations.
Smithsonian
“Any sufficiently advanced technology is equivalent to magic”*…
In the 1930s, AT&T was rolling out dial phones to the American public…
This short subject newsreel was shown in movie theaters the week before a town’s or region’s telephone exchange was to be converted to dial service. It’s extremely short—a little over a minute, like a PSA. The film concisely explains how to use a dial telephone, including how to dial, how to recognize dial tone, and how to recognize a busy signal…
For a look into the then-future (the now-present), fast forward just over 50 years, to the early 90s and to AT&T’s predictions…
More at the AT&T Tech Channel.
[TotH to @BoingBoing for a pointer to the first video]
* Arthur C. Clarke (a 1976 interview with whom is in the Tech Channel trove)
###
As we ponder progress, we might send AT&T-related birthday greetings to Robert Woodrow Wilson; he was born on this date in 1936. An astronomer, he detected– with Bell Labs colleague Arno Penzias– cosmic microwave background radiation: “relic radiation”– that’s to say, the “sound”– of the Big Bang… familiar to those of us old enough to remember watching an old-fashioned television after the test pattern was gone (when there was no broadcast signal received): the “fuzz” we saw and the static-y sounds we heard were the “relic radiation” being picked up.
Their 1964 discovery earned them the 1978 Nobel Prize in Physics.







