(Roughly) Daily

Posts Tagged ‘copyright’

“Nothing is so painful to the human mind as a great and sudden change”*…

If an AI-infused web is the future, what can we learn from the past? Jeff Jarvis has some provocative thoughts…

The Gutenberg Parenthesis—the theory that inspired my book of the same name—holds that the era of print was a grand exception in the course of history. I ask what lessons we may learn from society’s development of print culture as we leave it for what follows (the connected age of networks, data, and intelligent machines), and as we negotiate the fate of such institutions as copyright, the author, and mass media as they are challenged by developments such as generative AI.

Let’s start from the beginning…

In examining the half-millennium of print’s history, three moments in time struck me: 

  • After Johannes Gutenberg’s development of movable type in the 1450s in Europe (separate from its prior invention in China and Korea), it took a half-century for the book as we now know it to evolve out of its scribal roots—with titles, title pages, and page numbers. It took another century, until this side and that of 1600, before there arose tremendous innovation with print: the invention of the modern novel with Cervantes, the essay with Montaigne, a market for printed plays with Shakespeare, and the newspaper.
  • It took another century before a business model for print at last emerged with copyright, which was enacted in Britain in 1710, not to protect authors but instead to transform literary works into tradable assets, primarily for the benefit of the still-developing industry of publishing. 
  • And it was one more century—after 1800—before major changes came to the technology of print: the steel press, stereotyping (to mold complete pages rather than resetting type with every edition), steam-powered presses, paper made from abundant wood pulp instead of scarce rags, and eventually the marvelous Linotype, eliminating the job of the typesetter. Before the mechanization and industrialization of print, the average circulation of a daily newspaper in America was 4,000 (the size of a healthy Substack newsletter these days). Afterwards, mass media, the mass market, and the idea of the mass were born alongside the advertising to support them. 

One lesson in this timeline is that the change we experience today, which we think is moving fast, is likely only the beginning. We are only a quarter century past the introduction of the commercial web browser, which puts us at about 1480 in Gutenberg years. There could be much disruption and invention still ahead. Another lesson is that many of the institutions we assume are immutable—copyright, the concept of creativity as property, mass media and its scale, advertising and the attention economy—are not forever. That is to say that we can reconsider, reinvent, reject, or replace them as need and opportunity present…

Read on for his suggestion for a reinvention of copyright: “Gutenberg’s lessons in the era of AI,” from @jeffjarvis via @azeem in his valuable newsletter @ExponentialView.

* Mary Wollstonecraft Shelley, Frankenstein

###

As we contemplate change, we might spare a thought for Jan Hus. A Czech theologian and philosopher who became a Church reformer, he was burned at the stake as a heretic (for condemning indulgences and the Crusades) on this date in 1415. His teachings (which largely echoed those of Wycliffe) had a strong influence, over a century later, on Martin Luther, helping inspire the Reformation… which was fueled by Gutenberg’s technology, which had been developed and had begun to spread in the meantime.

Jan Hus at the stake, Jena codex (c. 1500) source

“I know that there are people who do not love their fellow man, and I hate people like that!”*…

After this post, your correspondent begins his annual Holiday Hiatus, so let me pass along a gift. Regular service will resume on January 2; meantime, thanks to you all for reading– and Happy Holidays!

Ladies and Gentlemen, the remarkable Tom Lehrer

Between the… Weird Al biopic and Kelly Bachman and Dylan Adler’s new comedy album, Rape Victims Are Horny Too, parody songsters have rarely been such a central part of the cultural zeitgeist. Yet comedy musicians have been trailblazing for much longer than we may realize, and before the popularity of Weird Al, or even Bo Burnham, Adam Sandler, and Steve Martin, there was Tom Lehrer.

Lehrer is perhaps best known for “The Elements,” an impossibly fast and impossibly tongue-twisty song that consists of all the periodic elements put to the tune of the “Major-General’s Song” from Pirates of Penzance. This is a classic Tom Lehrer song, so classic in fact that Daniel Radcliffe’s rendition of it on The Graham Norton Show helped score him the role of Weird Al, who has cited Lehrer as a major comedic influence. This full-circle loop only emphasizes how important Lehrer was and continues to be in the musical comedy scene, and diving into his diverse oeuvre solidifies just how innovative he was…

“Tom Lehrer Deserves Your Attention” (source of the photo above)

And it’s now easier than ever for you to pay him that attention: the remarkable Mr. Lehrer is also remarkably generous. Last month, a new web site appeared, the home page of which explains…

I, Tom Lehrer, and the Tom Lehrer Trust 2007, hereby grant the following permissions:

All copyrights to lyrics or music written or composed by me have been relinquished, and therefore such songs are now in the public domain. All of my songs that have never been copyrighted, having been available for free for so long, are now also in the public domain.

The latter includes all lyrics which I have written to music by others, although the music to such parodies, if copyrighted by their composers, are of course not included without permission of their copyright owners. The translated songs on this website may be found on YouTube in their original languages.

Performing and recording rights to all of my songs are included in this permission. Translation rights are also included….

The site contains all of his lyrics, and free streaming and downloadable versions of all of his albums– a satirical gold mine: “Songs and Lyrics by Tom Lehrer.”

* Tom Lehrer

###

As we soak ourselves in satire, we might recall that it was on this date in 1986 that Frank Oz’s film Little Shop of Horrors premiered. Based on the Off-Broadway musical adaptation of Roger Corman’s original 1960 movie, it was a huge success with critics and a modest success with audiences on release, but has since become a cult film.

As Oz’s friend and Muppet colleague Jim Henson said, “the lip sync on the plant in that film is just absolutely amazing.”

source

Written by (Roughly) Daily

December 19, 2022 at 1:00 am

“There are only two different types of companies in the world: those that have been breached and know it and those that have been breached and don’t know it.”*…

Enrique Mendoza Tincopa (and here) with a visualization of what’s on offer on the dark web and what it costs…

Did you know that the internet you’re familiar with is only 10% of the total data that makes up the World Wide Web?

The rest of the web is hidden from plain sight, and requires special access to view. It’s known as the Deep Web, and nestled far down in the depths of it is a dark, sometimes dangerous place, known as the darknet, or Dark Web…

Visual Capitalist

For a larger version, click here

And for a look at the research that underlies the graphic, click here.

What’s your personal information worth? “The Dark Web Price Index 2022,” from @DatavizAdventuR via @VisualCap.

(Image at top: source)

* Ted Schlein

###

As we harden our defenses, we might recall that it was on this date in 1994 that arguments began in the case of United States vs. David LaMacchia, in which David LaMacchia stood accused of Conspiracy to Commit Wire Fraud. He had allegedly operated the “Cynosure” bulletin board system (BBS) for six weeks, hosting pirated software on Massachusetts Institute of Technology (MIT) servers. Federal prosecutors didn’t directly charge LaMacchia with violating copyright statutes; rather, they chose to charge him under a federal wire fraud statute that had been enacted in 1952 to prevent the use of telephone systems for interstate fraud. But the court ruled that, as he had no commercial motive (he was not charging for the shared software), copyright violation could not be prosecuted under the wire fraud statute; LaMacchia was found not guilty– giving rise to what became known as “the LaMacchia loophole”… and spurring legislative action to try to close that gap.

Background documents from the case are here.

The MIT student paper, covering the case (source)

Written by (Roughly) Daily

November 18, 2022 at 1:00 am

“Beware of him who would deny you access to information, for in his heart he dreams himself your master”*…

NOAA/Plotting the position of the survey ship PATHFINDER, Alaska

Stewart Brand once suggested that “Information wants to be free. Information also wants to be expensive. …That tension will not go away.” Indeed, it seems to be growing…

Aaron Swartz was 26 years old when he took his own life. He did so under the shadow of legal prosecution, pursued by government lawyers intent on maximal punishment. If found guilty, he potentially faced up to 50 years in prison and a $1 million fine. Swartz’s crime was not only legal, but political. He had accessed a private computer network and gained possession of highly valuable information with the goal of sharing it. His actions threatened some of the most powerful, connected, and politically protected groups in the country. Their friends in the government were intent on sending a message.

It’s the kind of story you would expect about some far-off political dissident. But Swartz took his life in Brooklyn on a winter day in 2013 and his prosecutor was the U.S. federal government. When Swartz died, he was under indictment for 13 felony charges related to his use of an MIT computer to download too many scientific articles from the academic database JSTOR, ostensibly for the purpose of making them freely available to the public. Ultimately, Swartz potentially faced more jail time for downloading academic papers than he would have if he had helped Al Qaeda build a nuclear weapon. Even the Criminal Code of the USSR stipulated that those who stored and distributed anti-Soviet literature only faced five to seven years in prison. While prosecutors later pointed toward a potential deal for less time, Aaron would still have been labeled a felon for his actions—and to boot, JSTOR itself had reached a civil settlement and didn’t even pursue its own lawsuit.

But Aaron’s cause lived on. This September marks the ten-year anniversary of Sci-Hub, the online “shadow library” that provides access to millions of research papers otherwise hidden behind prohibitive paywalls. Founded by the Kazakhstani computer scientist Alexandra Elbakyan—popularly known as science’s “pirate queen”—Sci-Hub has grown to become a repository of over 85 million academic papers.

The site is popular globally, used by millions of people—many of whom would otherwise not be able to finish their degrees, advise their patients, or use text mining algorithms to make new scientific discoveries. Sci-Hub has become the unacknowledged foundation that helps the whole enterprise of academia to function. 

Even when they do not need to use Sci-Hub, the superior user experience it offers means that many people prefer to use the illegal site rather than access papers through their own institutional libraries. It is difficult to say how many ideas, grants, publications, and companies have been made possible by Sci-Hub, but it seems undeniable that Elbakyan’s ten-year-old website has become a crucial component of contemporary scholarship.  

The success of Sci-Hub has made it a target for injunctions and investigations. Academic publishers have sued Sci-Hub repeatedly, opponents have accused the site’s creators of collaborating with Russian intelligence, and private sector critics have dubbed it a threat to “national security.” Elbakyan recently tweeted out a notification she received that the FBI had requested her personal data from Apple. 

Whatever happens to Sci-Hub or Elbakyan, the fact that such a site exists is something of a tragedy. Sci-Hub currently fills a niche that should never have existed. Like the black-market medicine purchased by people who cannot afford prescription drugs, its very being indicts the official system that created the conditions of its emergence… 

The cost of individually purchasing all the articles required to complete a typical literature review could easily amount to thousands of dollars. Beyond paying for the articles themselves, academics often have to pay steep fees to publish their research. Meanwhile, most peer reviewers and editors charged with assessing, correcting, and formatting papers do not receive compensation for their work. 

It’s particularly ironic that this situation exists alongside a purported digital “infodemic” of misinformation. The costs of this plague are massive, from opposition to the pandemic response to the conspiracy theories that drove a loving father to fire his gun inside a pizza parlor and a man to kill a mafia boss accused of having ties to the deep state. But few public figures, if any, draw the direct connection between the expensive barricades around scientific research and the conspicuous epistemic collapse of significant portions of the American political discourse…

Whether intended or not, the impossibly high paywalls of academic publishers only serve to keep scientific information out of the population’s hands. What makes this even more discordant is that the people being prevented from accessing the information are often also the taxpayers who helped fund the research in the first place. 

By framing the debate about Sci-Hub as one concerning property rights, both advocates of Elbakyan’s site and its detractors fall afoul of what John Gall called the “operational fallacy.” In his book The Systems Bible, Gall defined the operational fallacy as a situation where “the system itself does not do what it says it is doing.” In other words, what a system calls itself is not always a reliable indicator of its true function. In this case, the name of the “academic publishing industry” implies that it is supposed to be involved in the dissemination of scholarship. But the effective function of the academic publishing industry as it actually exists is to prevent the dissemination of scholarly work. 

Given the example of Sci-Hub, the easy logistics of internet publication, and the funding structure of academic research, it seems clear that in the absence of the academic publishing industry, scholarship would be more widely available, not less. If the academic publishing industry did not exist, scientists could still do their research—in fact, it would be easier to do so as more people would have access to scholarly literature. The peer-review process could still function normally—though there are good reasons to change that as well. And the resulting papers could simply be posted in a place where anyone could read them. 

When we explore the actual function of the academic publishing industry—restricting access to scholarly research—we see that these publishers have little in common with the companies that have opposed other file-sharing sites. When several record companies sued Napster in 2001, they could make the legitimate case that the economic well-being of the musicians, producers, and all the people who were needed to distribute recorded music was at stake. No such parallel exists in the case of Sci-Hub. Scientists are not paid by the publishers. Peer reviewers are not paid by the publishers. Distribution itself, as proven by Sci-Hub and its more law-abiding brother arXiv, is cheap enough to be provided to the public for free. It’s not surprising, then, that polls reveal that scientists overwhelmingly support Sci-Hub…  

Eminently worth reading in full– the civic tragedy of academic publishing: “A World Without Sci-Hub,” from Jason Rhys Parry (@JRhysParry) in @palladiummag.

* Sid Meier

###

As we share and share alike, we might recall that it was on this date in 1970 that the Public Broadcasting Service– PBS– premiered, when it took over (most of) the functions of its predecessor, National Educational Television.

Unlike the five major commercial broadcast television networks in the United States (ABC, CBS, NBC, Fox, and The CW), PBS is technically not a network, but rather a program distributor that provides television content and related services to its member stations. Each station sets its own schedule and programs local content (e.g., local/state news, interviews, cultural, and public affairs programs) that supplements content provided by PBS and other public television distributors.

source

“Doing research on the Web is like using a library assembled piecemeal by pack rats and vandalized nightly”*…

But surely, argues Jonathan Zittrain, it shouldn’t be that way…

Sixty years ago the futurist Arthur C. Clarke observed that any sufficiently advanced technology is indistinguishable from magic. The internet—how we both communicate with one another and together preserve the intellectual products of human civilization—fits Clarke’s observation well. In Steve Jobs’s words, “it just works,” as readily as clicking, tapping, or speaking. And every bit as much aligned with the vicissitudes of magic, when the internet doesn’t work, the reasons are typically so arcane that explanations for it are about as useful as trying to pick apart a failed spell.

Underpinning our vast and simple-seeming digital networks are technologies that, if they hadn’t already been invented, probably wouldn’t unfold the same way again. They are artifacts of a very particular circumstance, and it’s unlikely that in an alternate timeline they would have been designed the same way.

The internet’s distinct architecture arose from a distinct constraint and a distinct freedom: First, its academically minded designers didn’t have or expect to raise massive amounts of capital to build the network; and second, they didn’t want or expect to make money from their invention.

The internet’s framers thus had no money to simply roll out a uniform centralized network the way that, for example, FedEx metabolized a capital outlay of tens of millions of dollars to deploy liveried planes, trucks, people, and drop-off boxes, creating a single point-to-point delivery system. Instead, they settled on the equivalent of rules for how to bolt existing networks together.

Rather than a single centralized network modeled after the legacy telephone system, operated by a government or a few massive utilities, the internet was designed to allow any device anywhere to interoperate with any other device, allowing any provider able to bring whatever networking capacity it had to the growing party. And because the network’s creators did not mean to monetize, much less monopolize, any of it, the key was for desirable content to be provided naturally by the network’s users, some of whom would act as content producers or hosts, setting up watering holes for others to frequent.

Unlike the briefly ascendant proprietary networks such as CompuServe, AOL, and Prodigy, content and network would be separated. Indeed, the internet had and has no main menu, no CEO, no public stock offering, no formal organization at all. There are only engineers who meet every so often to refine its suggested communications protocols that hardware and software makers, and network builders, are then free to take up as they please.

So the internet was a recipe for mortar, with an invitation for anyone, and everyone, to bring their own bricks. Tim Berners-Lee took up the invite and invented the protocols for the World Wide Web, an application to run on the internet. If your computer spoke “web” by running a browser, then it could speak with servers that also spoke web, naturally enough known as websites. Pages on sites could contain links to all sorts of things that would, by definition, be but a click away, and might in practice be found at servers anywhere else in the world, hosted by people or organizations not only not affiliated with the linking webpage, but entirely unaware of its existence. And webpages themselves might be assembled from multiple sources before they displayed as a single unit, facilitating the rise of ad networks that could be called on by websites to insert surveillance beacons and ads on the fly, as pages were pulled together at the moment someone sought to view them.

And like the internet’s own designers, Berners-Lee gave away his protocols to the world for free—enabling a design that omitted any form of centralized management or control, since there was no usage to track by a World Wide Web, Inc., for the purposes of billing. The web, like the internet, is a collective hallucination, a set of independent efforts united by common technological protocols to appear as a seamless, magical whole.
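
(An illustrative aside from your correspondent, not from Zittrain’s piece: “speaking web” really is nothing more than exchanging a few agreed-upon lines of text over a connection, which is why any device and any provider can join in. Below is a minimal sketch in Python, using only the standard library; the host is a placeholder, and real clients additionally handle TLS, redirects, and chunked encoding.)

    # A bare HTTP/1.1 request assembled by hand: the entire "protocol" a
    # browser and server must agree on to exchange a page is a short,
    # human-readable text format.
    import socket

    host = "example.com"  # placeholder host; plain HTTP on port 80
    request = (
        "GET / HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

    with socket.create_connection((host, 80)) as sock:
        sock.sendall(request.encode("ascii"))
        reply = b""
        while chunk := sock.recv(4096):  # read until the server closes the connection
            reply += chunk

    # The first line of the reply is the status line, e.g. "HTTP/1.1 200 OK"
    print(reply.split(b"\r\n", 1)[0].decode())

Any machine that can produce and parse those few bytes can interoperate with any web server, with no central operator granting permission.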

This absence of central control, or even easy central monitoring, has long been celebrated as an instrument of grassroots democracy and freedom. It’s not trivial to censor a network as organic and decentralized as the internet. But more recently, these features have been understood to facilitate vectors for individual harassment and societal destabilization, with no easy gating points through which to remove or label malicious work not under the umbrellas of the major social-media platforms, or to quickly identify their sources. While both assessments have power to them, they each gloss over a key feature of the distributed web and internet: Their designs naturally create gaps of responsibility for maintaining valuable content that others rely on. Links work seamlessly until they don’t. And as tangible counterparts to online work fade, these gaps represent actual holes in humanity’s knowledge…
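
(A second illustrative aside, again not from the article: link rot is easy to observe firsthand. The sketch below, in standard-library Python, gathers a page’s outbound links and probes each one, flagging those that no longer resolve. The starting URL is a placeholder, and a real audit would add retries, rate limiting, and respect for robots.txt; some servers also reject HEAD requests, so results are indicative rather than definitive.)

    # Collect the outbound links on one page, then probe each to see whether
    # it still resolves. Anything that times out, fails DNS, or returns an
    # error status is a candidate case of link rot.
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Accumulate href values from anchor tags as the page is parsed."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value and value.startswith("http"):
                        self.links.append(value)

    def probe(url, timeout=10):
        """Return (status, detail) for a URL; status is None on failure."""
        try:
            req = urllib.request.Request(url, method="HEAD",
                                         headers={"User-Agent": "link-rot-probe"})
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return resp.status, resp.reason
        except Exception as exc:  # dead host, timeout, 4xx/5xx, etc.
            return None, str(exc)

    page = "https://example.com/"  # placeholder page to audit
    with urllib.request.urlopen(page, timeout=10) as resp:
        collector = LinkCollector()
        collector.feed(resp.read().decode("utf-8", errors="replace"))

    for link in collector.links:
        status, detail = probe(urljoin(page, link))
        marker = "OK " if status and status < 400 else "ROT"
        print(f"{marker} {link} ({status or detail})")

Run against pages of even modest age, a probe like this tends to surface dead links quickly, which is the quiet erosion the excerpt describes.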

The glue that holds humanity’s knowledge together is coming undone: “The Internet Is Rotting.” @zittrain explains what we can do to heal it.

(Your correspondent seconds his call to support the critically-important work of The Internet Archive and the Harvard Library Innovation Lab, along with the other initiatives he outlines.)

* Roger Ebert

###

As we protect our past for the future, we might recall that it was on this date in 1937 that Hormel introduced Spam. It was the company’s attempt to increase sales of pork shoulder, not at the time a very popular cut. While there are numerous speculations as to the “meaning of the name” (from a contraction of “spiced ham” to “Scientifically Processed Animal Matter”), its true genesis is known to only a small circle of former Hormel Foods executives.

As a result of the difficulty of delivering fresh meat to the front during World War II, Spam became a ubiquitous part of the U.S. soldier’s diet. It became variously referred to as “ham that didn’t pass its physical,” “meatloaf without basic training,” and “Special Army Meat.” Over 150 million pounds of Spam were purchased by the military before the war’s end. During the war and the occupations that followed, Spam was introduced into Guam, Hawaii, Okinawa, the Philippines, and other islands in the Pacific. Immediately absorbed into native diets, it has become a unique part of the history and effects of U.S. influence in the Pacific islands.

source
