(Roughly) Daily

Posts Tagged ‘computing’

“I am so clever that sometimes I don’t understand a single word of what I am saying”*…

Humans claim to be intelligent, but what exactly is intelligence? Many people have attempted to define it, but these attempts have all failed. So I propose a new definition: intelligence is whatever humans do.

I will attempt to prove this new definition is superior to all previous attempts to define intelligence. First, consider humans’ history. It is a story of repeated failures. First humans thought the Earth was flat. Then they thought the Sun went around the Earth. Then they thought the Earth was the center of the universe. Then they thought the universe was static and unchanging. Then they thought the universe was infinite and expanding. Humans were wrong about alchemy, phrenology, bloodletting, creationism, astrology, numerology, and homeopathy. They were also wrong about the best way to harvest crops, the best way to govern, the best way to punish criminals, and the best way to cure the sick.

I will not go into the many ways humans have been wrong about morality. The list is long and depressing. If humans are so smart, how come they keep being wrong about everything?

So, what does it mean to be intelligent?…

Arram Sabeti (@arram) gave a prompt to GPT-3, a machine-learning language model; it wrote: “Are Humans Intelligent? A Salty AI Op-Ed.”
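(For the technically curious: Sabeti had early access to the GPT-3 API beta; essays like the one above are simply the model’s continuation of a human-supplied prompt. Here is a minimal sketch of that workflow using the OpenAI completions API as it existed circa 2020– the engine name, sampling settings, and prompt below are illustrative assumptions, not Sabeti’s published setup.)

```python
# A minimal prompt-completion sketch against the 2020-era OpenAI API.
# Assumptions: the "davinci" engine, sampling settings, and prompt text
# are illustrative; Sabeti's actual configuration wasn't published.
import openai

openai.api_key = "sk-..."  # your API key here

response = openai.Completion.create(
    engine="davinci",        # the original GPT-3 model
    prompt="Are Humans Intelligent? A Salty AI Op-Ed.\n\n",
    max_tokens=500,          # roughly essay length
    temperature=0.9,         # higher temperature -> saltier prose
)

print(response.choices[0].text)
```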

(image above: source)

* Oscar Wilde

###

As we hail our new robot overlords, we might recall that it was on this date in 1814 that London suffered “The Great Beer Flood Disaster” when the metal bands on an immense vat at Meux’s Horse Shoe Brewery snapped, releasing a tidal wave of 3,555 barrels of Porter (571 tons– more than 1 million pints), which swept away the brewery walls, flooded nearby basements, and collapsed several adjacent tenements. While there were reports of over twenty fatalities resulting from poisoning by the porter fumes or alcohol coma, it appears that the death toll was 8, all of them caused by the huge wave of beer crashing through the structures surrounding the brewery.

(The U.S. had its own vat mishap in 1919, when the bands on a tank at a Boston molasses plant similarly burst, creating a heavy wave of molasses moving at an estimated 35 mph; it killed 21 and injured 150.)

Meux’s Horse Shoe Brewery

source

“Simulation is the situation created by any system of signs when it becomes sophisticated enough, autonomous enough, to abolish its own referent and to replace it with itself”*…

It is not often that a comedian gives an astrophysicist goose bumps when discussing the laws of physics. But comic Chuck Nice managed to do just that in a recent episode of the podcast StarTalk. The show’s host Neil deGrasse Tyson had just explained the simulation argument—the idea that we could be virtual beings living in a computer simulation. If so, the simulation would most likely create perceptions of reality on demand rather than simulate all of reality all the time—much like a video game optimized to render only the parts of a scene visible to a player. “Maybe that’s why we can’t travel faster than the speed of light, because if we could, we’d be able to get to another galaxy,” said Nice, the show’s co-host, prompting Tyson to gleefully interrupt. “Before they can program it,” the astrophysicist said, delighting at the thought. “So the programmer put in that limit.”
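(The video-game analogy is doing real work here: game engines routinely use lazy, on-demand computation, materializing only what an observer can currently see. A toy sketch of the principle– all names hypothetical:)

```python
# Toy "render on demand": regions of a world are computed only when an
# observer looks at them -- everything unobserved stays an unevaluated stub.
import functools

@functools.lru_cache(maxsize=None)      # remember a region once it's rendered
def render_region(x: int, y: int) -> str:
    print(f"simulating region ({x}, {y})")  # the expensive work happens here
    return f"region({x},{y})"

def observe(x: int, y: int) -> str:
    return render_region(x, y)          # observation triggers rendering

observe(0, 0)   # computed on first look
observe(0, 0)   # cached -- no recomputation
# Region (10**6, 10**6) is never computed unless someone looks at it.
```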

Such conversations may seem flippant. But ever since Nick Bostrom of the University of Oxford wrote a seminal paper about the simulation argument in 2003, philosophers, physicists, technologists and, yes, comedians have been grappling with the idea of our reality being a simulacrum. Some have tried to identify ways in which we can discern if we are simulated beings. Others have attempted to calculate the chance of us being virtual entities. Now a new analysis shows that the odds that we are living in base reality—meaning an existence that is not simulated—are pretty much even. But the study also demonstrates that if humans were to ever develop the ability to simulate conscious beings, the chances would overwhelmingly tilt in favor of us, too, being virtual denizens inside someone else’s computer…
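(The analysis in question is Bayesian; here is a toy version of the bookkeeping– deliberately simplified, not the published study’s actual model. With an agnostic 50–50 prior on whether conscious-hosting simulations ever get built, the odds of being simulated hover just below one half; once we demonstrably build one ourselves, they tilt overwhelmingly.)

```python
# Toy Bayesian bookkeeping for the simulation argument (a deliberate
# simplification -- not the published study's actual model).
from fractions import Fraction

def p_simulated(p_sims_exist: Fraction, n_sims: int) -> Fraction:
    """Posterior probability that we are simulated.

    If conscious-hosting simulations exist, one base reality plus
    n_sims simulations makes being virtual n_sims/(n_sims + 1) likely;
    weight that by the prior that such simulations exist at all.
    """
    return p_sims_exist * Fraction(n_sims, n_sims + 1)

# Agnostic 50-50 prior on whether such simulations ever run:
print(float(p_simulated(Fraction(1, 2), 10**6)))  # ~0.4999995 -- about even

# The moment we demonstrably run one ourselves, the prior collapses:
print(float(p_simulated(Fraction(1, 1), 10**6)))  # ~0.999999 -- overwhelming
```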

Learn why gauging whether or not we dwell inside someone else’s computer may come down to advanced AI research—or measurements at the frontiers of cosmology: “Do We Live in a Simulation? Chances Are about 50–50.”

* Jean Baudrillard (who was describing the ways in which the significations and symbolism of culture and media are involved in constructing an understanding of shared existence… which may or may not, itself, be a simulation)

###

As we play the odds, we might send dark birthday greetings to Friedrich Wilhelm Nietzsche; he was born on this date in 1844. A philosopher, cultural critic, composer, poet, and philologist, he and his work have had a profound influence on modern intellectual history.

Nietzsche became the youngest person ever to hold the Chair of Classical Philology at the University of Basel in 1869 at the age of 24, but resigned in 1879 due to health problems that plagued him most of his life. He completed much of his core writing in the following decade, before suffering a complete mental breakdown in 1889, after which he lived in care until his death in 1900.

Nietzsche’s writing spanned philosophical polemics, poetry, cultural criticism, and fiction, all the while displaying a fondness for aphorism and irony. He’s best remembered as a philosopher, for work that included his radical critique of truth in favor of perspectivism; for his genealogical critique of religion and Christian morality (and his related theory of master–slave morality); for his aesthetic affirmation of existence in response to his famous observation of the “death of God” and the profound crisis of nihilism; for his notion of the Apollonian and Dionysian; and for his characterization of the human subject as the expression of competing wills, collectively understood as the will to power. Nietzsche also developed influential concepts such as the Übermensch and the doctrine of eternal return.

After his death, his sister Elisabeth became the curator and editor of Nietzsche’s manuscripts. She edited his unpublished writings to fit her German nationalist beliefs– often contradicting or obfuscating Nietzsche’s stated opinions, which were explicitly opposed to antisemitism and nationalism. Through her published editions, Nietzsche’s work became associated with fascism and Nazism. But scholars contested this interpretation, and corrected editions of his writings were soon made available. Nietzsche’s thought enjoyed renewed popularity in the 1960s and his ideas have since had a profound impact on 20th and early-21st century thinkers across philosophy—especially in schools of continental philosophy such as existentialism, postmodernism and post-structuralism—as well as in art, literature, psychology, politics, and popular culture.

source

Written by LW

October 15, 2020 at 1:01 am

“Success is stumbling from failure to failure with no loss of enthusiasm”*…

Killed by Google is the Google graveyard: a free and open-source list of discontinued Google services, products, devices, and apps. We aim to be a source of factual information about the history surrounding Google’s dead projects.

Contributors from around the world help compile, research, and maintain the information about dying and dead Google products. You can join the discussion on GitHub, or follow us on Twitter. A project by Cody Ogden.

206 projects, and counting– some have been supplanted by newer Google services; some, outmatched by competitors; and some… well, maybe just not such good ideas to begin with: “Killed By Google.”
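(The list itself is maintained as structured data in the project’s GitHub repository; a sketch of pulling and tallying it follows– note that the raw-file URL and the JSON field names here are assumptions about the repo’s layout and may have changed.)

```python
# Sketch: tally Google's dead products by year of shutdown, straight from
# the project's data file. Assumptions: the raw-file URL and JSON field
# names ("name", "dateClose") reflect the repo's layout and may have changed.
import json
from collections import Counter
from urllib.request import urlopen

URL = ("https://raw.githubusercontent.com/"
       "codyogden/killedbygoogle/main/graveyard.json")

with urlopen(URL) as resp:
    graveyard = json.load(resp)

deaths_by_year = Counter(entry["dateClose"][:4] for entry in graveyard)
for year, count in sorted(deaths_by_year.items()):
    print(year, count)
print(f"{len(graveyard)} projects, and counting")
```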

* Winston Churchill

###

As we obsess over obsolescence, we might recall that it was on this date in 1995 that The Media Lab at the Massachusetts Institute of Technology chronicled the World Wide Web in its A Day in the Life of Cyberspace project.

To celebrate its 10th anniversary, the Media Lab had invited submissions for the days leading up to October 10, 1995, on a variety of issues related to technology and the Internet, including privacy, expression, age, wealth, faith, body, place, languages, and the environment.  Then on October 10, a team at MIT collected, edited, and published the contributions to “create a mosaic of life at the dawn of the digital revolution that is transforming our planet.”

source

“By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

What is it like to be a smartphone? In all the chatter about the future of artificial intelligence, the question has been glossed over or, worse, treated as settled. The longstanding assumption, a reflection of the anthropomorphic romanticism of computer scientists, science fiction writers, and internet entrepreneurs, has been that a self-aware computer would have a mind, and hence a consciousness, similar to our own. We, supreme programmers, would create machine consciousness in our own image.

The assumption is absurd, and not just because the sources and workings of our own consciousness remain unknown to us and hence unavailable as models for coders and engineers. Consciousness is entwined with being, and being with body, and a computer’s body and (speculatively) being have nothing in common with our own. A far more reasonable assumption is that the consciousness of a computer, should it arise, would be completely different from the consciousness of a human being. It would be so different that we probably wouldn’t even recognize it as a consciousness…

The Turing test, in all its variations, would also be useless in identifying an AI. It merely tests for a machine’s ability to feign likeness with ourselves. It provides no insight into the AI’s being, which, again, could be entirely separate from its ability to trick us into sensing it is like us. The Turing test tells us about our own skills; it says nothing about the character of the artificial being.

All of this raises another possibility. It may be that we are already surrounded by AIs but have no idea that they exist. Their beingness is invisible to us, just as ours is to them. We are both objects in the same place, but as beings we inhabit different universes. Our smartphones may right now be having, to borrow [philosopher Thomas] Nagel’s words, “experiences fully comparable in richness of detail to our own.”

Look at your phone. You see a mere tool, there to do your bidding, and perhaps that’s the way your phone sees you, the dutiful but otherwise unremarkable robot that from time to time plugs it into an electrical socket.

Nicholas Carr (@roughtype) applies a well-known thought-experiment on the nature of consciousness to artificial intelligence: “What is it like to be a smartphone?”

See also here, the source of the image above.

* Eliezer Yudkowsky

###

As we wonder, we might spare a thought for Seymour Roger Cray; he died on this date in 1996.  An electrical engineer and computer architect, he designed a series of computers that were the fastest in the world for decades, and founded Cray Research which built many of these machines– effectively creating the supercomputer industry and earning the honorific “father of supercomputing.”

With a Cray-1

source

“If you are not completely confused by quantum mechanics, you do not understand it”*…

 


 

If we can harness it, quantum technology promises fantastic new possibilities. But first, scientists need to coax fragile quantum systems into holding their states for longer than a few millionths of a second.

A team of scientists at the University of Chicago’s Pritzker School of Molecular Engineering announced the discovery of a simple modification that allows quantum systems to stay operational—or “coherent”—10,000 times longer than before. Though the scientists tested their technique on a particular class of quantum systems called solid-state qubits, they think it should be applicable to many other kinds of quantum systems and could thus revolutionize quantum communication, computing and sensing…

Down at the level of atoms, the world operates according to the rules of quantum mechanics—very different from what we see around us in our daily lives. These different rules could translate into technology like virtually unhackable networks or extremely powerful computers; the U.S. Department of Energy released a blueprint for the future quantum internet in an event at UChicago on July 23. But fundamental engineering challenges remain: Quantum states need an extremely quiet, stable space to operate, as they are easily disturbed by background noise coming from vibrations, temperature changes or stray electromagnetic fields.

Thus, scientists try to find ways to keep the system coherent as long as possible…
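(As the linked article explains, the team’s fix was to apply a continuous alternating magnetic field that lets the qubit “tune out” slowly varying background noise. A toy numerical sketch of that averaging principle, using the simplest stand-in– periodically inverting the accumulated phase, as in a spin echo, rather than the experiment’s continuous drive; all parameters are arbitrary:)

```python
# Toy illustration of why rapidly driving a qubit suppresses slow noise:
# a quasi-static random field makes each qubit's phase wander, destroying
# ensemble coherence; periodically inverting the accumulated phase (an
# echo-style stand-in for the experiment's continuous AC drive) cancels
# the slow wander almost entirely. All parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_trials, dt = 2000, 500, 1e-3

def coherence(flip_period=None):
    noise = rng.normal(size=n_trials)   # one quasi-static field per trial
    phases = np.zeros(n_trials)
    sign = 1.0
    for step in range(n_steps):
        if flip_period and step % flip_period == 0:
            sign = -sign                # invert phase accumulation (echo)
        phases += sign * noise * dt
    return abs(np.mean(np.exp(1j * phases)))  # 1.0 = perfect coherence

print("free evolution :", coherence())               # ~0.13 -- mostly dephased
print("with decoupling:", coherence(flip_period=10)) # ~1.0  -- noise tuned out
```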

“This breakthrough lays the groundwork for exciting new avenues of research in quantum science,” said study lead author David Awschalom, the Liew Family Professor in Molecular Engineering, senior scientist at Argonne National Laboratory and director of the Chicago Quantum Exchange. “The broad applicability of this discovery, coupled with a remarkably simple implementation, allows this robust coherence to impact many aspects of quantum engineering. It enables new research opportunities previously thought impractical.”…

Very big news at a very small scale: “Scientists discover way to make quantum states last 10,000 times longer.”

* John Wheeler

###

As we strive for stability, we might send calculated birthday greetings to Brook Taylor; he was born on this date in 1685.  A mathematician, he is best known for his work in describing and understanding oscillation.  In 1708, Taylor produced a solution to the problem of the center of oscillation.  His Methodus incrementorum directa et inversa (“Direct and Indirect Methods of Incrementation,” 1715) introduced what is now called the calculus of finite differences.  Using this, he was the first to express mathematically the movement of a vibrating string on the basis of mechanical principles.  Methodus also contained Taylor’s theorem, later recognized by Joseph Lagrange as the basis of differential calculus.
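(For reference: the theorem that bears his name expands a sufficiently smooth function about a point in terms of its derivatives there, while the calculus of finite differences rests on the forward difference operator.)

```latex
% Taylor's theorem: f expanded about the point a
f(x) \;=\; \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n
\qquad\text{and the forward difference}\qquad
\Delta_h f(x) \;=\; f(x+h) - f(x)
```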

A gifted artist, Taylor also wrote on the basic principles of perspective, including the first general treatment of the principle of vanishing points.

source

 

 
