Posts Tagged ‘network’
“I fear the day when the technology overlaps with our humanity. The world will only have a generation of idiots.”*…
Alva Noë on the importance of humans hanging on to their humanity– for all the promise and dangers of AI, computers plainly can’t think. To think is to resist – something no machine does:
Computers don’t actually do anything. They don’t write, or play; they don’t even compute. Which doesn’t mean we can’t play with computers, or use them to invent, or make, or problem-solve. The new AI is unexpectedly reshaping ways of working and making, in the arts and sciences, in industry, and in warfare. We need to come to terms with the transformative promise and dangers of this new tech. But it ought to be possible to do so without succumbing to bogus claims about machine minds.
What could ever lead us to take seriously the thought that these devices of our own invention might actually understand, and think, and feel, or that, if not now, then later, they might one day come to open their artificial eyes thus finally to behold a shiny world of their very own? One source might simply be the sense that, now unleashed, AI is beyond our control. Fast, microscopic, distributed and astronomically complex, it is hard to understand this tech, and it is tempting to imagine that it has power over us.
But this is nothing new. The story of technology – from prehistory to now – has always been that of the ways we are entrained by the tools and systems that we ourselves have made. Think of the pathways we make by walking. To every tool there is a corresponding habit, that is, an automatised way of acting and being. From the humble pencil to the printing press to the internet, our human agency is enacted in part by the creation of social and technological landscapes that in turn transform what we can do, and so seem, or threaten, to govern and control us.
Yet it is one thing to appreciate the ways we make and remake ourselves through the cultural transformation of our worlds via tool use and technology, and another to mystify dumb matter put to work by us. If there is intelligence in the vicinity of pencils, shoes, cigarette lighters, maps or calculators, it is the intelligence of their users and inventors. The digital is no different.
But there is another origin of our impulse to concede mind to devices of our own invention, and this is what I focus on here: the tendency of some scientists to take for granted what can only be described as a wildly simplistic picture of human and animal cognitive life. They rely unchecked on one-sided, indeed, milquetoast conceptions of human activity, skill and cognitive accomplishment. The surreptitious substitution (to use a phrase of Edmund Husserl’s) of this thin gruel version of the mind at work – a substitution that I hope to convince you traces back to Alan Turing and the very origins of AI – is the decisive move in the conjuring trick.
What scientists seem to have forgotten is that the human animal is a creature of disturbance. Or as the mid-20th-century philosopher of biology Hans Jonas wrote: ‘Irritability is the germ, and as it were the atom, of having a world…’ With us there is always, so to speak, a pebble in the shoe. And this is what moves us, turns us, orients us to reorient ourselves, to do things differently, so that we might carry on. It is irritation and disorientation that is the source of our concern. In the absence of disturbance, there is nothing: no language, no games, no goals, no tasks, no world, no care, and so, yes, no consciousness…
[Starting with Turing, Noë considers the relative roles of humans and technology across a number of spheres, including music…]
… The piano was invented, to be sure, but not by you or me. We encounter it. It pre-exists us and solicits our submission. To learn to play is to be altered, made to adapt one’s posture, hands, fingers, legs and feet to the piano’s mechanical requirements. Under the regime of the piano keyboard, it is demanded that we ourselves become player pianos, that is to say, extensions of the machine itself.
But we can’t. And we won’t. To learn to play, to take on the machine, for us, is to struggle. It is hard to master the instrument’s demands.
And this fact – the difficulty we encounter in the face of the keyboard’s insistence – is productive. We make art out of it. It stops us being player pianos, but it is exactly what is required if we are to become piano players.
For it is the player’s fraught relation to the machine, and to the history and tradition that the machine imposes, that supplies the raw material of musical invention. Music and play happen in that entanglement. To master the piano, as only a person can, is not just to conform to the machine’s demands. It is, rather, to push back, to say no, to rage against the machine. And so, for example, we slap and bang and shout out. In this way, the piano becomes not merely a vehicle of habit and control – a mechanism – but rather an opportunity for action and expression.
And, as with the piano, so with the whole of human cultural life. We live in the entanglement between government and resistance. We fight back…
… The telling fact: computers are used to play our games; they are engineered to make moves in the spaces opened up by our concerns. They don’t have concerns of their own, and they make no new games. They invent no new language.
The British philosopher R G Collingwood noticed that the painter doesn’t invent painting, and the musician doesn’t invent the musical culture in which they find themselves. And for Collingwood this served to show that no person is fully autonomous, a God-like fount of creativity; we are always to some degree recyclers and samplers and, at our best, participants in something larger than ourselves.
But this should not be taken to show that we become what we are (painters, musicians, speakers) by doing what, for example, LLMs do – i.e., merely by getting trained up on large data sets. Humans aren’t trained up. We have experience. We learn. And for us, learning a language, for example, isn’t learning to generate ‘the next token’. It’s learning to work, play, eat, love, flirt, dance, fight, pray, manipulate, negotiate, pretend, invent and think. And crucially, we don’t merely incorporate what we learn and carry on; we always resist. Our values are always problematic. We are not merely word-generators. We are makers of meaning.
We can’t help doing this; no computer can do this…
Eminently worth reading in full: “Rage against the machine,” from @alvanoe in @aeonmag.
For more, see Noë’s The Entanglement: How Art and Philosophy Make Us What We Are.
* Albert Einstein
###
As we resolve to wrestle, we might recall that it was on this date in 1969 that UCLA professor Leonard Kleinrock (aided by his student assistant Charley Kline) created the first networked computer-to-computer connection (with SRI programmer Bill Duvall in Menlo Park), via which they sent the first networked computer-to-computer communication… or at least part of it. Duvall’s machine crashed partway through the transmission, so the only letters received of the attempted “login” were “lo.” The next month two more nodes were added (UCSB and the University of Utah), and the network was dubbed ARPANET.
Still, “lo”– perhaps an appropriate way to announce what would grow up to be the internet.

“Those who can imagine anything, can create the impossible”*…
As Charlie Wood explains, physicists are building neural networks out of vibrations, voltages and lasers, arguing that the future of computing lies in exploiting the universe’s complex physical behaviors…
… When it comes to conventional machine learning, computer scientists have discovered that bigger is better. Stuffing a neural network with more artificial neurons — nodes that store numerical values — improves its ability to tell a dachshund from a Dalmatian, or to succeed at myriad other pattern recognition tasks. Truly tremendous neural networks can pull off unnervingly human undertakings like composing essays and creating illustrations. With more computational muscle, even grander feats may become possible. This potential has motivated a multitude of efforts to develop more powerful and efficient methods of computation.
[Cornell’s Peter McMahon] and a band of like-minded physicists champion an unorthodox approach: Get the universe to crunch the numbers for us. “Many physical systems can naturally do some computation way more efficiently or faster than a computer can,” McMahon said. He cites wind tunnels: When engineers design a plane, they might digitize the blueprints and spend hours on a supercomputer simulating how air flows around the wings. Or they can stick the vehicle in a wind tunnel and see if it flies. From a computational perspective, the wind tunnel instantly “calculates” how wings interact with air.
A wind tunnel is a single-minded machine; it simulates aerodynamics. Researchers like McMahon are after an apparatus that can learn to do anything — a system that can adapt its behavior through trial and error to acquire any new ability, such as classifying handwritten digits or distinguishing one spoken vowel from another. Recent work has shown that physical systems like waves of light, networks of superconductors and branching streams of electrons can all learn.
“We are reinventing not just the hardware,” said Benjamin Scellier, a mathematician at the Swiss Federal Institute of Technology Zurich in Switzerland who helped design a new physical learning algorithm, but “also the whole computing paradigm.”…
Computing at the largest scale? “How to Make the Universe Think for Us,” from @walkingthedot in @QuantaMagazine.
###
As we think big, we might send well-connected birthday greetings to Leonard Kleinrock; he was born on this date in 1934. A computer scientist, he made several foundational contributions to the field, in particular to the theoretical foundations of data communication in computer networking. Perhaps most notably, he was central to the development of ARPANET (which essentially grew up to be the internet); his graduate students at UCLA were instrumental in developing the communication protocols for internetworking that made that possible.

“In the attempt to make scientific discoveries, every problem is an opportunity and the more difficult the problem, the greater will be the importance of its solution”*…
(Roughly) Daily is headed into its traditional Holiday hibernation; regular service will begin again very early in the New Year.
It seems appropriate (especially given the travails of this past year) to end the year on a positive and optimistic note, with a post celebrating an extraordinary accomplishment– Science magazine‘s (thus, the AAAS‘) “Breakthrough of the Year” for 2021…
In his 1972 Nobel Prize acceptance speech, American biochemist Christian Anfinsen laid out a vision: One day it would be possible, he said, to predict the 3D structure of any protein merely from its sequence of amino acid building blocks. With hundreds of thousands of proteins in the human body alone, such an advance would have vast applications, offering insights into basic biology and revealing promising new drug targets. Now, after nearly 50 years, researchers have shown that artificial intelligence (AI)-driven software can churn out accurate protein structures by the thousands—an advance that realizes Anfinsen’s dream and is Science’s 2021 Breakthrough of the Year.
AI-powered predictions show proteins finding their shapes: the full story: “Protein structures for all.”
And read Nature‘s profile of the scientist behind the breakthrough: “John Jumper: Protein predictor.”
* E. O. Wilson
###
As we celebrate science, we might send well-connected birthday greetings to Robert Elliot Kahn; he was born on this date in 1938. An electrical engineer and computer scientist, he and his co-creator, Vint Cerf, first proposed the Transmission Control Protocol (TCP) and the Internet Protocol (IP), the fundamental communication protocols at the heart of the Internet. Earlier, he had worked, along with fellow computer scientists Lawrence Roberts, Paul Baran, and Leonard Kleinrock, on the ARPANET, the first network to successfully link computers around the country.
Kahn has won the Turing Award, the National Medal of Technology, and the Presidential Medal of Freedom, among many, many other awards and honors.
“Beware of him who would deny you access to information, for in his heart he dreams himself your master”*…
Stewart Brand once suggested that “Information wants to be free. Information also wants to be expensive. …That tension will not go away.” Indeed, it seems to be growing…
Aaron Swartz was 26 years old when he took his own life. He did so under the shadow of legal prosecution, pursued by government lawyers intent on maximal punishment. If found guilty, he potentially faced up to 50 years in prison and a $1 million fine. Swartz’s crime was not only legal, but political. He had accessed a private computer network and gained possession of highly valuable information with the goal of sharing it. His actions threatened some of the most powerful, connected, and politically protected groups in the country. Their friends in the government were intent on sending a message.
It’s the kind of story you would expect about some far-off political dissident. But Swartz took his life in Brooklyn on a winter day in 2013 and his prosecutor was the U.S. federal government. When Swartz died, he was under indictment for 13 felony charges related to his use of an MIT computer to download too many scientific articles from the academic database JSTOR, ostensibly for the purpose of making them freely available to the public. Ultimately, Swartz potentially faced more jail time for downloading academic papers than he would have if he had helped Al Qaeda build a nuclear weapon. Even the Criminal Code of the USSR stipulated that those who stored and distributed anti-Soviet literature only faced five to seven years in prison. While prosecutors later pointed toward a potential deal for less time, Aaron would still have been labeled a felon for his actions—and to boot, JSTOR itself had reached a civil settlement and didn’t even pursue its own lawsuit.
But Aaron’s cause lived on. This September marks the ten-year anniversary of Sci-Hub, the online “shadow library” that provides access to millions of research papers otherwise hidden behind prohibitive paywalls. Founded by the Kazakhstani computer scientist Alexandra Elbakyan—popularly known as science’s “pirate queen”—Sci-Hub has grown to become a repository of over 85 million academic papers.
The site is popular globally, used by millions of people—many of whom would otherwise not be able to finish their degrees, advise their patients, or use text mining algorithms to make new scientific discoveries. Sci-Hub has become the unacknowledged foundation that helps the whole enterprise of academia to function.
Even when they do not need to use Sci-Hub, the superior user experience it offers means that many people prefer to use the illegal site rather than access papers through their own institutional libraries. It is difficult to say how many ideas, grants, publications, and companies have been made possible by Sci-Hub, but it seems undeniable that Elbakyan’s ten-year-old website has become a crucial component of contemporary scholarship.
The success of Sci-Hub has made it a target for injunctions and investigations. Academic publishers have sued Sci-Hub repeatedly, opponents have accused the site’s creators of collaborating with Russian intelligence, and private sector critics have dubbed it a threat to “national security.” Elbakyan recently tweeted out a notification she received that the FBI had requested her personal data from Apple.
Whatever happens to Sci-Hub or Elbakyan, the fact that such a site exists is something of a tragedy. Sci-Hub currently fills a niche that should never have existed. Like the black-market medicine purchased by people who cannot afford prescription drugs, its very being indicts the official system that created the conditions of its emergence…
The cost of individually purchasing all the articles required to complete a typical literature review could easily amount to thousands of dollars. Beyond paying for the articles themselves, academics often have to pay steep fees to publish their research. Meanwhile, most peer reviewers and editors charged with assessing, correcting, and formatting papers do not receive compensation for their work.
It’s particularly ironic that this situation exists alongside a purported digital “infodemic” of misinformation. The costs of this plague are massive, from opposition to the pandemic response to the conspiracy theories that drove a loving father to fire his gun inside a pizza parlor and a man to kill a mafia boss accused of having ties to the deep state. But few public figures, if any, draw the direct connection between the expensive barricades around scientific research and the conspicuous epistemic collapse of significant portions of the American political discourse…
Whether intended or not, the impossibly high paywalls of academic publishers only serve to keep scientific information out of the population’s hands. What makes this even more discordant is that the people being prevented from accessing the information are often also the taxpayers who helped fund the research in the first place.
By framing the debate about Sci-Hub as one concerning property rights, both advocates of Elbakyan’s site and its detractors fall afoul of what John Gall called the “operational fallacy.” In his book The Systems Bible, Gall defined the operational fallacy as a situation where “the system itself does not do what it says it is doing.” In other words, what a system calls itself is not always a reliable indicator of its true function. In this case, the name of the “academic publishing industry” implies that it is supposed to be involved in the dissemination of scholarship. But the effective function of the academic publishing industry as it actually exists is to prevent the dissemination of scholarly work.
Given the example of Sci-Hub, the easy logistics of internet publication, and the funding structure of academic research, it seems clear that in the absence of the academic publishing industry, scholarship would be more widely available, not less. If the academic publishing industry did not exist, scientists could still do their research—in fact, it would be easier to do so as more people would have access to scholarly literature. The peer-review process could still function normally—though there are good reasons to change that as well. And the resulting papers could simply be posted in a place where anyone could read them.
When we explore the actual function of the academic publishing industry—restricting access to scholarly research—we see that these publishers have little in common with the companies that have opposed other file-sharing sites. When several record companies sued Napster in 2001, they could make the legitimate case that the economic well-being of the musicians, producers, and all the people who were needed to distribute recorded music was at stake. No such parallel exists in the case of Sci-Hub. Scientists are not paid by the publishers. Peer reviewers are not paid by the publishers. Distribution itself, as proven by Sci-Hub and its more law-abiding brother arXiv, is cheap enough to be provided to the public for free. It’s not surprising, then, that polls reveal that scientists overwhelmingly support Sci-Hub…
Eminently worth reading in full– the civic tragedy of academic publishing: “A World Without Sci-Hub,” from Jason Rhys Parry (@JRhysParry) in @palladiummag.
###
As we share and share alike, we might recall that it was on this date in 1970 that the Public Broadcasting Service– PBS– premiered, when it took over (most of) the functions of its predecessor, National Educational Television.
Unlike the five major commercial broadcast television networks in the United States (ABC, CBS, NBC, Fox, and The CW), PBS is technically not a network, but rather a program distributor that provides television content and related services to its member stations. Each station sets its own schedule and programs local content (e.g., local/state news, interviews, cultural, and public affairs programs) that supplements content provided by PBS and other public television distributors.