(Roughly) Daily

“In the attempt to make scientific discoveries, every problem is an opportunity and the more difficult the problem, the greater will be the importance of its solution”*…

(Roughly) Daily is headed into its traditional Holiday hibernation; regular service will begin again very early in the New Year.

It seems appropriate (especially given the travails of this past year) to end the year on a positive and optimistic note, with a post celebrating an extraordinary accomplishment– Science magazine’s (thus, the AAAS’) “Breakthrough of the Year” for 2021…

In his 1972 Nobel Prize acceptance speech, American biochemist Christian Anfinsen laid out a vision: One day it would be possible, he said, to predict the 3D structure of any protein merely from its sequence of amino acid building blocks. With hundreds of thousands of proteins in the human body alone, such an advance would have vast applications, offering insights into basic biology and revealing promising new drug targets. Now, after nearly 50 years, researchers have shown that artificial intelligence (AI)-driven software can churn out accurate protein structures by the thousands—an advance that realizes Anfinsen’s dream and is Science’s 2021 Breakthrough of the Year.

AI-powered predictions show proteins finding their shapes– the full story: “Protein structures for all.”

And read Nature’s profile of the scientist behind the breakthrough: “John Jumper: Protein predictor.”

* E. O. Wilson

###

As we celebrate science, we might send well-connected birthday greetings to Robert Elliot Kahn; he was born on this date in 1938. An electrical engineer and computer scientist, he and his co-creator, Vint Cerf, first proposed the Transmission Control Protocol (TCP) and the Internet Protocol (IP), the fundamental communication protocols at the heart of the Internet. Earlier, he and Vint, along with fellow computer scientists Lawrence Roberts, Paul Baran, and Leonard Kleinrock, had helped build the ARPANET, the first network to successfully link computers around the country.

Kahn has won the Turing Award, the National Medal of Technology, and the Presidential Medal of Freedom, among many, many other awards and honors.

source

“Beware of him who would deny you access to information, for in his heart he dreams himself your master”*…

NOAA/Plotting the position of the survey ship PATHFINDER, Alaska

Stewart Brand once suggested that “Information wants to be free. Information also wants to be expensive. …That tension will not go away.” Indeed, it seems to be growing…

Aaron Swartz was 26 years old when he took his own life. He did so under the shadow of legal prosecution, pursued by government lawyers intent on maximal punishment. If found guilty, he potentially faced up to 50 years in prison and a $1 million fine. Swartz’s crime was not only legal, but political. He had accessed a private computer network and gained possession of highly valuable information with the goal of sharing it. His actions threatened some of the most powerful, connected, and politically protected groups in the country. Their friends in the government were intent on sending a message.

It’s the kind of story you would expect about some far-off political dissident. But Swartz took his life in Brooklyn on a winter day in 2013 and his prosecutor was the U.S. federal government. When Swartz died, he was under indictment for 13 felony charges related to his use of an MIT computer to download too many scientific articles from the academic database JSTOR, ostensibly for the purpose of making them freely available to the public. Ultimately, Swartz potentially faced more jail time for downloading academic papers than he would have if he had helped Al Qaeda build a nuclear weapon. Even the Criminal Code of the USSR stipulated that those who stored and distributed anti-Soviet literature only faced five to seven years in prison. While prosecutors later pointed toward a potential deal for less time, Aaron would still have been labeled a felon for his actions—and to boot, JSTOR itself had reached a civil settlement and didn’t even pursue its own lawsuit.

But Aaron’s cause lived on. This September marks the ten-year anniversary of Sci-Hub, the online “shadow library” that provides access to millions of research papers otherwise hidden behind prohibitive paywalls. Founded by the Kazakhstani computer scientist Alexandra Elbakyan—popularly known as science’s “pirate queen”—Sci-Hub has grown to become a repository of over 85 million academic papers.

The site is popular globally, used by millions of people—many of whom would otherwise not be able to finish their degrees, advise their patients, or use text mining algorithms to make new scientific discoveries. Sci-Hub has become the unacknowledged foundation that helps the whole enterprise of academia to function. 

Even when they do not need to use Sci-Hub, the superior user experience it offers means that many people prefer to use the illegal site rather than access papers through their own institutional libraries. It is difficult to say how many ideas, grants, publications, and companies have been made possible by Sci-Hub, but it seems undeniable that Elbakyan’s ten-year-old website has become a crucial component of contemporary scholarship.  

The success of Sci-Hub has made it a target for injunctions and investigations. Academic publishers have sued Sci-Hub repeatedly, opponents have accused the site’s creators of collaborating with Russian intelligence, and private sector critics have dubbed it a threat to “national security.” Elbakyan recently tweeted out a notification she received that the FBI had requested her personal data from Apple. 

Whatever happens to Sci-Hub or Elbakyan, the fact that such a site exists is something of a tragedy. Sci-Hub currently fills a niche that should never have existed. Like the black-market medicine purchased by people who cannot afford prescription drugs, its very being indicts the official system that created the conditions of its emergence… 

The cost of individually purchasing all the articles required to complete a typical literature review could easily amount to thousands of dollars. Beyond paying for the articles themselves, academics often have to pay steep fees to publish their research. Meanwhile, most peer reviewers and editors charged with assessing, correcting, and formatting papers do not receive compensation for their work. 
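A quick back-of-the-envelope makes that claim concrete (our illustration, in Python– the figures are assumptions, not the essay’s; à la carte paywall prices typically run roughly $30–$50 per article):

ARTICLES_IN_REVIEW = 150   # size of a typical review bibliography (assumption)
PRICE_PER_ARTICLE = 35.00  # USD; mid-range paywall price (assumption)

total = ARTICLES_IN_REVIEW * PRICE_PER_ARTICLE
print(f"Reading {ARTICLES_IN_REVIEW} paywalled papers: ${total:,.0f}")
# -> Reading 150 paywalled papers: $5,250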

It’s particularly ironic that this situation exists alongside a purported digital “infodemic” of misinformation. The costs of this plague are massive, from opposition to the pandemic response to the conspiracy theories that drove a loving father to fire his gun inside a pizza parlor and a man to kill a mafia boss accused of having ties to the deep state. But few public figures, if any, draw the direct connection between the expensive barricades around scientific research and the conspicuous epistemic collapse of significant portions of the American political discourse…

Whether intended or not, the impossibly high paywalls of academic publishers only serve to keep scientific information out of the population’s hands. What makes this even more discordant is that the people being prevented from accessing the information are often also the taxpayers who helped fund the research in the first place. 

By framing the debate about Sci-Hub as one concerning property rights, both advocates of Elbakyan’s site and its detractors fall afoul of what John Gall called the “operational fallacy.” In his book The Systems Bible, Gall defined the operational fallacy as a situation where “the system itself does not do what it says it is doing.” In other words, what a system calls itself is not always a reliable indicator of its true function. In this case, the name of the “academic publishing industry” implies that it is supposed to be involved in the dissemination of scholarship. But the effective function of the academic publishing industry as it actually exists is to prevent the dissemination of scholarly work. 

Given the example of Sci-Hub, the easy logistics of internet publication, and the funding structure of academic research, it seems clear that in the absence of the academic publishing industry, scholarship would be more widely available, not less. If the academic publishing industry did not exist, scientists could still do their research—in fact, it would be easier to do so as more people would have access to scholarly literature. The peer-review process could still function normally—though there are good reasons to change that as well. And the resulting papers could simply be posted in a place where anyone could read them. 

When we explore the actual function of the academic publishing industry—restricting access to scholarly research—we see that these publishers have little in common with the companies that have opposed other file-sharing sites. When several record companies sued Napster in 2001, they could make the legitimate case that the economic well-being of the musicians, producers, and all the people who were needed to distribute recorded music was at stake. No such parallel exists in the case of Sci-Hub. Scientists are not paid by the publishers. Peer reviewers are not paid by the publishers. Distribution itself, as proven by Sci-Hub and its more law-abiding brother arXiv, is cheap enough to be provided to the public for free. It’s not surprising, then, that polls reveal that scientists overwhelmingly support Sci-Hub…  

Eminently worth reading in full– the civic tragedy of academic publishing: “A World Without Sci-Hub,” from Jason Rhys Parry (@JRhysParry) in @palladiummag.

* Sid Meier

###

As we share and share alike, we might recall that it was on this date in 1970 that the Public Broadcasting Service– PBS– premiered, when it took over (most of) the functions of its predecessor, National Educational Television.

Unlike the five major commercial broadcast television networks in the United States (ABC, CBS, NBC, Fox, and The CW), PBS is technically not a network, but rather a program distributor that provides television content and related services to its member stations. Each station sets its own schedule and programs local content (e.g., local/state news, interviews, cultural, and public affairs programs) that supplements content provided by PBS and other public television distributors.

source

“Attend to mushrooms and all other things will answer up”*…

Travis Boyer: Crush Blue, 2020

The living– and conscious?– infrastructure of the biosphere…

Imagine that you are afloat on your back in the sea. You have some sense of its vast, unknowable depths—worlds of life are surely darting about beneath you. Now imagine lying in a field, or on the forest floor. The same applies, though we rarely think of it: the dirt beneath you, whether a mile or a foot deep, is teeming with more organisms than researchers can quantify. Their best guess is that there are as many as one billion microbes in a single teaspoon of soil. Plant roots plunge and swerve like superhighways with an infinite number of on-ramps. And everywhere there are probing fungi.

Fungi are classified as their own kingdom, separate from plants and animals. They are often microscopic and reside mostly out of sight—mainly underground—but as Merlin Sheldrake writes in Entangled Life: How Fungi Make Our Worlds, Change Our Minds and Shape Our Futures, they support and sustain nearly all living systems. Fungi are nature’s premier destroyers and creators, digesting the world’s dead and leaving behind new soil. When millions of hair-like fungal threads—called hyphae—coalesce, felting themselves into complex shapes, they emerge from the ground as mushrooms. A mushroom is to a fungus as a pear is to a pear tree: the organism’s fruiting body, with spores instead of seeds. Mushrooms disperse spores by elaborate means: some species generate puffs of air to send them aloft, while others eject them by means of tiny, specialized catapults so they accelerate ten thousand times faster than a space shuttle during launch.

But Sheldrake is most interested in fungi’s other wonders—specifically, how they challenge our understanding of nonhuman intelligence and stretch the notion of biological individuality. Fungi infiltrate the roots of almost every plant, determining so much about its life that researchers are now asking whether plants can be considered plants without them. They are similarly interwoven throughout the human body, busily performing functions necessary to our health and well-being or, depending on the fungi’s species and lifestyle, wreaking havoc. All of this prompts doubts about what we thought we knew to be the boundaries between one organism and another…

Fungi themselves form large networks of hyphae strands in order to feed. These strands, when massed together, are called mycelium. The total length of mycelium threaded through the globe’s uppermost four inches of soil is believed to be enough to span half the width of our galaxy. Mycelium is constantly moving, probing its surroundings in every direction and coordinating its movements over long distances. When food is found—a nice chunk of rotting wood, for example—disparate parts of the mycelium redirect to coalesce around it, excrete enzymes that digest it externally, and then absorb it. As Sheldrake puts it, “The difference between animals and fungi is simple: Animals put food in their bodies, whereas fungi put their bodies in the food.”

Fungi are literally woven into the roots and bodies of nearly every plant grown in natural conditions. “A plant’s fungal partners,” Sheldrake writes, “can have a noticeable impact on its growth.” In one striking example, he describes an experiment in which strawberries grown with different fungal partners changed their sweetness and shape. Bumblebees seemed able to discern the difference and were more attracted to the flowers of strawberry plants grown with certain fungal species. Elsewhere he discusses an experiment in which researchers took fungi that inhabited the roots of a species of coastal grass that grew readily in saltwater and added it to a dry-land grass that could not tolerate the sea. Suddenly the dry-land grass did just fine in brine.

Much has been written lately about trees communicating and sharing resources among themselves; healthy trees have been documented moving resources toward trees that have fallen ill. This is often characterized as friendship or altruism between trees, but it is not at all clear whether trees pass information or nutrients intentionally. What is clear, though, is that the fungal networks entwined in every tree root make this communication possible. “Why might it benefit a fungus to pass a warning between the multiple plants that it lives with?” Sheldrake asks. The answer is survival. “If a fungus is connected to several plants and one is attacked by aphids, the fungus will suffer as well as the plant,” he writes. “It is the fungus that stands to benefit from keeping the healthy plant alive.”…

Fungi are genetically closer to animals than to plants, and similar enough to humans at the molecular level that we benefit from many of their biochemical innovations. In fact, many of our pharmaceuticals are borrowed innovations from fungi. Penicillin, discovered in 1928 by the Scottish researcher Alexander Fleming, is a compound produced by fungus for protection against bacterial infection. The anti-cancer drug Taxol was originally isolated from the fungi that live inside yew trees. More than half of all enzymes used in industry are generated by fungi, Sheldrake notes, and 15 percent of all vaccines are produced using yeast. We are, as he puts it, “borrowing a fungal solution and rehousing it within our own bodies.”…

We know that fungi maintain “countless channels of chemical communication with other organisms,” and that they are constantly processing diverse information about their environment. Some can recognize color, thanks to receptors sensitive to blue and red light, though it is not entirely clear what they do with that information. Some even have opsins, light-detecting proteins also found within the rods and cones of the animal eye. One fungus, Phycomyces blakesleeanus, has a sensitivity to light similar to that of a human eye and can “detect light at levels as low as that provided by a single star” to help it decide where to grow. It is also able to sense the presence of nearby objects and will bend away from them before ever making contact. Still other fungi recognize texture; according to Sheldrake, the bean rust fungus has been demonstrated to detect grooves in artificial surfaces “three times shallower than the gap between the laser tracks on a CD.”

Can fungi, then, be said to have a mind of their own? That is, as Sheldrake puts it, a “question of taste”—there is no settled scientific definition for “intelligence,” not even for animals. The Latin root of the word means “to choose between,” an action fungi clearly do all the time. But the application of this kind of term to fungi is loaded with something more mystical than that simple definition and demands a willingness to rattle our sense of where we ourselves fall in the imagined hierarchy of life. If fungi can be said to think, it is a form of cognition so utterly different that we strain to see it.

After all, philosophers of mind like Daniel Dennett argue that drawing any neat line between nonhumans and humans with “real minds” is an “archaic myth.” Our brains evolved from nonmental material. “Brains are just one such network,” Sheldrake writes, “one way of processing information.” We still don’t know how the excitement of brain cells gives rise to experience. Can we really dismiss the possibility of cognition in an organism that clearly adapts, learns, and makes decisions simply based on the lack of a brain structure analogous to ours?

Perhaps there is intelligent life all around us, and our view is too human-centric to notice. Are fungi intelligent? Sheldrake reserves judgment, deferring instead to scientific mystery: “A sophisticated understanding of mycelium is yet to emerge.” Still, after spending long enough in the atmosphere of Sheldrake’s sporulating mind, I began to adopt the fungal perspective. I can’t help now but see something like a mind wherever there might be fungal threads—which is to say everywhere, a mesh-like entangled whole, all over the earth.

Fungi challenge our understanding of nonhuman intelligence and complicate the boundaries between one organism and another: “Our Silent Partners”– Zoë Schlanger (@zoeschlanger) reviewing Merlin Sheldrake’s Entangled Life: How Fungi Make Our Worlds, Change Our Minds and Shape Our Futures in @nybooks.

“Why did the mushroom go to the party? Because he was a fungi.” – Lewis Tomlinson

* A. R. Ammons

###

As we ponder partnership, we might spare a thought for Jens Wilhelm August Lind; he died on this date in 1939. An apothecary, botanist, and mycologist, he published a full account of all fungi collected in Denmark by his teacher, Emil Rostrup. Combining his pharmaceutical and mycological knowledge, he was an early experimenter with the chemical control of plant pathogens.

Lind also collaborated with Knud Jessen on an account of the immigration history of weeds to Denmark.

Gravestone of Jens Lind and wife Gunild, at Viborg Cemetery

source

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say”*…

There’s a depressing sort of symmetry in the fact that our modern paradigms of privacy were developed in response to the proliferation of photography and its exploitation by tabloids. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon. This principle undergirds the notice-and-consent model of data management, which you might also know as the Pavlovian response of clicking “I agree” on any popup and login screen with little regard for the forty pages of legalese you might be agreeing to.

The thing is, the right to be left alone makes perfect sense when you’re managing information relationships between individuals, where there are generally pretty clear social norms around what constitutes a boundary violation. Reasonable people can and do disagree as to the level of privacy they expect, but if I invite you into my home and you snoop through my bedside table and read my diary, there isn’t much ambiguity about that being an invasion.

But in the age of ✨ networked computing ✨, this individual model of privacy just doesn’t scale anymore. There are too many exponentially intersecting relationships for any of us to keep in our head. It’s no longer just about what we tell a friend or the tax collector or even a journalist. It’s the digital footprint that we often unknowingly leave in our wake every time we interact with something online, and how all of those websites and apps and their shadowy partners talk to each other behind our backs. It’s the cameras in malls tracking our location and sometimes emotions, and it’s the license plate readers compiling a log of our movements.

At a time when governments and companies are increasingly investing in surveillance mechanisms under the guise of security and transparency, that scale is only going to keep growing. Our individual comfort about whether we are left alone is no longer the only, or even the most salient part of the story, and we need to think about privacy as a public good and a collective value.

I like thinking about privacy as being collective, because it feels like a more true reflection of the fact that our lives are made up of relationships, and information about our lives is social and contextual by nature. The fact that I have a sister also indicates that my sister has at least one sibling: me. If I took a DNA test through 23andme I’m not just disclosing information about me but also about everyone that I’m related to, none of whom are able to give consent. The privacy implications for familial DNA are pretty broad: this information might be used to sell or withhold products and services, expose family secrets, or implicate a future as-yet-unborn relative in a crime. I could email 23andme and ask them to delete my records, and they might eventually comply in a month or three. But my present and future relatives wouldn’t be able to do that, or even know that their privacy had been compromised at all.

Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us. I might think nothing of posting a photo of going out with my friends and mentioning the name of the bar, but I’ve just exposed our physical location to the internet. If one of my friends has had to deal with a stalker in their past, I could’ve put their physical safety at risk. Even if I’m careful to make the post friends-only, the people I trust are not the same as the people my friends trust. In an individual model of privacy, we are only as private as our least private friend.
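A toy model captures that last point (our illustration, with invented numbers– not the author’s formalism): if anything one person shares can expose everyone connected to it, effective privacy is a minimum, not an average.

# Privacy scores from 0 (fully exposed) to 1 (fully private); values invented.
privacy = {"me": 0.9, "alice": 0.8, "bob": 0.2}

# Whatever I share can be re-shared by my least careful friend,
# so the floor of the group sets my effective privacy.
effective = min(privacy.values())
print(f"Effective privacy: {effective}")  # -> Effective privacy: 0.2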

Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me.

Data collection isn’t always bad, but it is always risky. Sometimes that’s due to shoddy design and programming or lazy security practices. But even the best engineers often fail to build risk-free systems, by the very nature of systems.

Systems are easier to attack than they are to defend. If you want to defend a system, you have to make sure every part of it is perfectly implemented to guard against any possible vulnerabilities. Oftentimes, trying to defend a system means adding additional components, which just ends up creating more potential weak points. Whereas if you want to attack, all you have to do is find the one weakness that the systems designer missed. (Or, to paraphrase the IRA, you only have to be lucky once.)
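The asymmetry can be made concrete with a toy probability model (our illustration– the independence assumption and the numbers are not from the essay): the defender must get every component right, so the chance of a flawless system shrinks multiplicatively as components are added.

# Probability that an n-component system has no exploitable weakness,
# assuming each component is independently secure with probability p.
def p_flawless(p: float, n: int) -> float:
    return p ** n

for n in (1, 10, 100, 1000):
    print(f"{n:>4} components at 99.9% each -> {p_flawless(0.999, n):.1%} flawless")

# Even at 99.9% per component, a 1000-component system has only a ~37%
# chance of containing no weakness for an attacker to find.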

This is true of all systems, digital or analog, but the thing that makes computer systems particularly vulnerable is that the same weaknesses can be deployed across millions of devices, in our phones and laptops and watches and toasters and refrigerators and doorbells. When a vulnerability is discovered in one system, an entire class of devices around the world is instantly a potential target, but we still have to go fix them one by one.

This is how the Equifax data leak happened. Equifax used a piece of open source software that had a security flaw in it, the people who work on that software found it and fixed it, and instead of diligently updating their systems Equifax hit the snooze button for four months and let hackers steal hundreds of millions of customer records. And while Equifax is definitely guilty of aforementioned lazy security practices, this incident also illustrates how fragile computer systems are. From the moment this bug was discovered, every server in the world that ran that software was at risk.

What’s worse, in many cases people weren’t even aware that their data was stored with Equifax. If you’re an adult who has had a job or a phone bill or interacted with a bank in the last seven years, your identifying information is collected by Equifax whether you like it or not. The only way to opt out would have been to be among the small percentage of overwhelmingly young, poor, and racialized people who have no credit histories, which significantly limits the scope of their ability to participate in the economy. How do you notice-and-consent your way out of that?

There unfortunately isn’t one weird trick to save democracy, but that doesn’t mean there aren’t lessons we can learn from history to figure out how to protect privacy as a public good. The scale and ubiquity of computers may be unprecedented, but so is the scale of our collective knowledge…

Read the full piece (and you should) for Jenny Zhang’s (@phirephoenix) compelling case that we should treat– and protect– privacy as a public good, and her explanation of how we might do that: “Left alone, together.” TotH to Sentiers.

[image above: source]

* Edward Snowden

###

As we think about each other, we might recall that it was on this date in 1939 that the first government appropriation was made to support the construction of the Harvard Mark I computer.

Designer Howard Aiken had enlisted IBM as a partner in 1937; company chairman Thomas Watson Sr. personally approved the project and its funding. It was completed in 1944 (and put to work on a set of war-related tasks, including calculations– overseen by John von Neumann– for the Manhattan Project).

The Mark I was the industry’s largest electromechanical calculator… and it was large: 51 feet long, 8 feet high, and 2 feet deep; it weighed about 9,445 pounds. The basic calculating units had to be synchronized and powered mechanically, so they were operated by a 50-foot (15 m) drive shaft coupled to a 5 horsepower electric motor, which served as the main power source and system clock. It could do 3 additions or subtractions in a second; a multiplication took 6 seconds; a division took 15.3 seconds; and a logarithm or a trigonometric function took over a minute… ridiculously slow by today’s standards, but a huge advance in its time.
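To put numbers on “ridiculously slow,” a quick sketch (the modern figure is a loose assumption of roughly one simple addition per nanosecond on a single core, not a benchmark):

mark1_adds_per_sec = 3                # from the specifications above
modern_adds_per_sec = 1_000_000_000   # assumption: ~1 ns per addition

print(f"Speedup on addition: ~{modern_adds_per_sec / mark1_adds_per_sec:,.0f}x")
print(f"1,000,000 additions, Mark I: ~{1_000_000 / mark1_adds_per_sec / 3600:.0f} hours")
print(f"1,000,000 additions, modern core: ~{1_000_000 / modern_adds_per_sec * 1000:.0f} ms")
# -> ~333,333,333x faster; ~93 hours vs. ~1 ms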

source

“The tribalizing power of the new electronic media, the way in which they return us to the unified fields of the old oral cultures, to tribal cohesion and pre-individualist patterns of thought, is little understood”*…

Nokia was dominant in mobile phone sales from 1998 to around 2010. Nokia’s slogan: Connecting people.

It was amazing to connect with people in the late 90s/early 2000s. I don’t think we were lonely exactly. But maybe meeting people was somewhere between an opportunity, something novel, and, yes, a need – suddenly it was possible to find the right person, or the right community.

So, the zeitgeist of the early 2000s.

I ran across a previous zeitgeist in an article about Choose Your Own Adventure books. They appeared and became massively popular at the same time as text adventure computer games, but neither inspired the invention of the other. How? The real answer may lie far deeper in the cultural subconscious … in the zeitgeist of the 1980s.

1980s: you.

2000s: connection.

2020s: ?

Zeitgeists don’t lead and zeitgeists don’t follow.

I think when we spot some kind of macro trend in establishment consumer ads, it’s never going to be about presenting people with something entirely new. To resonate, it has to be familiar – the trajectory that the consumer is already on – but it also has to scratch an itch. The brand wants to be a helpful fellow traveller, if you like.

I wonder what the zeitgeist of the 2020s will be, or is already maybe. What deep human need will be simultaneously a comfort and an aspiration? There should be hints of it in popular culture already. (If I knew how to put my finger on it, I’d be an ad planner.)

If I had to guess then it would be something about belonging.

There was a hint of this in Reddit’s 5 second Super Bowl commercial which went hard on one of their communities, r/WallStreetBets, ganging up to bring down hedge funds. Then we’ve got a couple of generations now who grew up with the idea of fandoms, and of course conspiracy theories like QAnon too. If you squint, you can kind of see this in the way Tesla operates: it’s a consumer brand but it’s also a passionate, combative cause.

Belonging to a tribe is about identity and strength, it’s solace and empowerment all at once. And also knowledge, certainty, and trust in an era of complexity, disinfo, and hidden agendas.

Given that backdrop, it’s maybe unsurprising that the trend in software is towards Discord servers and other virtual private neighbourhoods. But how else will this appear? And is it just the beginnings of something else, something bigger?

1980s (you), 2000s (connection). What’s the 2020s zeitgeist?” From Matt Webb (@genmon)

* Marshall McLuhan

###

As we double down on diversity, we might send well-connected birthday greetings to Joseph Carl Robnett Licklider; he was born on this date in 1915. Better known as “J.C.R.” or “Lick,” he was a prominent and especially impactful figure in the development of computing and computer science. Considered the “Johnny Appleseed” of computing, he planted many of the seeds of the digital age– especially via his idea of a universal computer network to easily transfer and retrieve information, which his successors developed into the internet.

Robert Taylor, founder of Xerox PARC’s Computer Science Laboratory and Digital Equipment Corporation’s Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”

source

Written by (Roughly) Daily

March 11, 2021 at 1:01 am
