(Roughly) Daily

Posts Tagged ‘social media’

“In our world of big names, curiously, our true heroes tend to be anonymous”*…

Pattie Maes was inventing the core principles behind the social media age when Mark Zuckerberg was still in kindergarten, but her contributions have been largely unrecognized. Steven Johnson explains…

Anyone who was around for the early days of the World Wide Web, before the Netscape IPO and the dotcom boom, knows that there was a strange quality to the medium back then – in many ways the exact opposite of the way the Web works today. It was oddly devoid of people. Tim Berners-Lee had conjured up a radically new way of organizing information through the core innovations of hypertext and URLs, which created a standardized way of pointing to the location of documents. But almost every Web page you found yourself on back in those frontier days was frozen in the form that its author had originally intended. The words on the screen couldn’t adapt to your presence and your interests as you browsed. Interacting with other humans and having conversations – all that was still what you did with email or USENET or dial-up bulletin boards like The Well. The original Web was more like a magic library, filled with pages that could connect to other pages through miraculous wormholes of links. But the pages themselves were fixed, and everyone browsed alone.

One of the first signs that the Web might eventually escape those confines arrived in the last months of 1994, with the release of an intriguing (albeit bare-bones) prototype called HOMR, short for the Helpful Online Music Recommendation service.

HOMR was one of a number of related projects that emerged in the early-to-mid-90s out of the MIT lab of the Belgian-born computer scientist Pattie Maes, projects that eventually culminated in a company that Maes co-founded, called Firefly. HOMR pulled off a trick that was genuinely unprecedented at the time: it could make surprisingly sophisticated recommendations of music that you might like. It seemed to be capable of learning something about you as an individual. Unlike just about everything else on the Web back then, HOMR’s pages were not one-size-fits-all. They suggested, perhaps for the first time, that this medium was capable of conveying personalized information. Firefly would then take that advance to the next level: not just recommending music, but actually connecting you to other people who shared your tastes.

Maes called the underlying approach “collaborative filtering”, but looking back on it with more than two decades’ worth of hindsight, it’s clear that what we were experiencing with HOMR and Firefly was the very beginnings of a new kind of software platform that would change the world in the coming decades, for better and for worse: social networks…
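Collaborative filtering is simple enough to sketch. Below is a minimal, hypothetical Python illustration of the user-based variant (not Firefly’s actual code, just the general technique): it recommends artists favored by listeners whose ratings most resemble yours. The users, artists, and ratings are all invented.

```python
# A minimal sketch of user-based collaborative filtering, the idea
# behind HOMR/Firefly (not their actual implementation): recommend
# items favored by users whose past ratings resemble yours.
# All users, artists, and ratings below are invented.
from math import sqrt

ratings = {
    "ana":  {"Bowie": 5, "Eno": 4, "Kraftwerk": 5},
    "ben":  {"Bowie": 4, "Eno": 5, "Nirvana": 2},
    "cleo": {"Nirvana": 5, "Pearl Jam": 4, "Bowie": 1},
}

def similarity(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm = sqrt(sum(a[i] ** 2 for i in shared)) * sqrt(sum(b[i] ** 2 for i in shared))
    return dot / norm

def recommend(user, k=2):
    """Score items the user hasn't rated, weighted by neighbor similarity."""
    mine = ratings[user]
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        w = similarity(mine, theirs)
        for item, r in theirs.items():
            if item not in mine:
                scores[item] = scores.get(item, 0.0) + w * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # -> ['Nirvana', 'Pearl Jam']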

Read on at “Intelligent Agent: How Pattie Maes almost invented social media,” from @stevenbjohnson, the first in a new series, “Hidden Heroes.”

* Daniel J. Boorstin

###

As we give credit where credit is due, we might recall that it was on this date in 1960 that the FDA approved the first birth control pill– an oral medication for use by women as a contraceptive. In 1953, birth control crusader Margaret Sanger and her supporter/financier, philanthropist Katharine Dexter McCormick, had given Dr. Gregory Pincus $150,000 to continue his prior research and develop a safe and effective oral contraceptive for women.

In just five years, almost half of married women on birth control were using it.

But the real revolution would come when unmarried women got access to oral contraceptives. That took time. But in around 1970 – 10 years after the pill was first approved – US state after US state started to make it easier for single women to get the pill…

And that was when the economic revolution really began.

Women in America started studying particular kinds of degrees – law, medicine, dentistry and MBAs – which had previously been very masculine.

In 1970, medical degrees were over 90% male. Law degrees and MBAs were over 95% male. Dentistry degrees were 99% male. But at the beginning of the 1970s – equipped with the pill – women surged into all these courses. At first, women made up a fifth of the class, then a quarter. By 1980 they often made up a third…

“The tiny pill which gave birth to an economic revolution,” by Tim Harford, in the BBC’s series 50 Things That Made the Modern Economy

source

“You say you’re a pessimist, but I happen to know that you’re in the habit of practicing your flute for two hours every evening”*…

The Harrowing of Hell, Hieronymus Bosch

A couple of weeks ago, (R)D featured a piece by Jonathan Haidt, “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” in which Haidt critiqued, among others, Robert Wright and his influential book, Non-Zero. In the spirit of George Bernard Shaw (who observed: “Both optimists and pessimists contribute to society. The optimist invents the aeroplane, the pessimist the parachute.“) Wright responds…

… There are three main culprits in Haidt’s story, three things that have torn our world asunder: the like button, the share button (or, on Twitter, the retweet button), and the algorithms that feed on those buttons. “Babel is a metaphor for what some forms of social media have done to nearly all of the groups and institutions most important to the country’s future—and to us as a people.”

I would seem uniquely positioned to cheer us up by taking issue with Haidt’s depressing diagnosis. Near the beginning of his piece, he depicts my turn-of-the-millennium book Nonzero: The Logic of Human Destiny as in some ways the antithesis of his thesis—as sketching a future in which information technology unites rather than divides…

Well, two things I’m always happy to do are (1) cheer people up; and (2) defend a book I’ve written. I’d like to thank Haidt (who is actually a friend—but whom I’ll keep calling “Haidt” to lend gravitas to this essay) for providing me the opportunity to do both at once.

But don’t let your expectations get too high about the cheering people up part—because, for starters, the book I’m defending wasn’t that optimistic. I wrote in Nonzero, “While I’m basically optimistic, an extremely bleak outcome is obviously possible.” And even if we avoid a truly apocalyptic fate, I added, “several moderately bleak outcomes are possible.”

Still, looking around today, I don’t see quite as much bleakness as Haidt seems to see. And one reason, I think, is that I don’t see the causes of our current troubles as being quite as novel as he does. We’ve been here before, and humankind survived…

Read on for a brief history of humankind’s wrestling with new information technologies (e.g., writing and the printing press). Wright concludes…

In underscoring the importance of working to erode the psychology of tribalism (a challenge approachable from various angles, including one I wrote a book about), I don’t mean to detract from the value of piecemeal reforms. Haidt offers worthwhile ideas about how to make social media less virulent and how to reduce the paralyzing influence of information technology on democracy. (He spends a lot of time on the info tech and democracy issue—and, once again, I’d say he’s identified a big problem but also a longstanding problem; I wrote about it in 1995, in a Time magazine piece whose archival version is mis-dated as 2001.) The challenge we face is too big to let any good ideas go to waste, and Haidt’s piece includes some good ones.

Still, I do think that stepping back and looking at the trajectory of history lets us assess the current turmoil with less of a sense of disorientation than Haidt seems to feel. At least, that’s one takeaway from my argument in Nonzero, which chronicled how the evolution of technology, especially information technology, had propelled human social organization from the hunter-gatherer village to the brink of global community—a threshold that, I argued, we will fail to cross at our peril.

This isn’t the place to try to recapitulate that argument in compelling form. (There’s a reason I devoted a whole book to it.) So there’s no reason the argument should make sense to you right now. All I can say is that if you do ever have occasion to assess the argument, and it does make sense to you, the turbulence we’re going through will also make more sense to you.

“Is Everything Falling Apart?” @JonHaidt thinks so; @robertwrighter is not so sure.

Apposite: “An optimist’s guide to the future: the economist who believes that human ingenuity will save the world,” and “The Future Will Be Shaped by Optimists,” from @kevin2kelly at @TedConferences.

* Friedrich Nietzsche (criticizing Schopenhauer)

###

As we look on the bright side of life, we might send darkly-tinted birthday greetings to Oswald Spengler; he was born on this date in 1880. Best known for his two-volume work, The Decline of the West (Der Untergang des Abendlandes), published in 1918 and 1922, he was a historian and philosopher of history who developed an “organic theory” of history that suggested that human cultures and civilizations are akin to biological entities, each with a limited, predictable, and deterministic lifespan– and that around the year 2000, Western civilization would enter a period of pre-death emergency whose countering would lead to 200 years of Caesarism (extra-constitutional omnipotence of the executive branch of government) before Western civilization’s final collapse. He was a major influence on many historians (including Arnold Toynbee and Samuel “Clash of Civilizations” Huntington).

source

“Our social tools are not an improvement to modern society, they are a challenge to it”*…

Nicolás Ortega. Source: “Turris Babel,” Coenraet Decker, 1679

Jonathan Haidt ponders the poisonous impact of social media, arguing that “It’s not just a phase,” and considering what we might do about it…

… It’s been clear for quite a while now that red America and blue America are becoming like two different countries claiming the same territory, with two different versions of the Constitution, economics, and American history. But Babel is not a story about tribalism; it’s a story about the fragmentation of everything. It’s about the shattering of all that had seemed solid, the scattering of people who had been a community. It’s a metaphor for what is happening not only between red and blue, but within the left and within the right, as well as within universities, companies, professional associations, museums, and even families.

Babel is a metaphor for what some forms of social media have done to nearly all of the groups and institutions most important to the country’s future—and to us as a people. How did this happen? And what does it portend for American life?

The high point of techno-democratic optimism was arguably 2011, a year that began with the Arab Spring and ended with the global Occupy movement. That is also when Google Translate became available on virtually all smartphones, so you could say that 2011 was the year that humanity rebuilt the Tower of Babel. We were closer than we had ever been to being “one people,” and we had effectively overcome the curse of division by language. For techno-democratic optimists, it seemed to be only the beginning of what humanity could do.

In February 2012, as he prepared to take Facebook public, Mark Zuckerberg reflected on those extraordinary times and set forth his plans. “Today, our society has reached another tipping point,” he wrote in a letter to investors. Facebook hoped “to rewire the way people spread and consume information.” By giving them “the power to share,” it would help them to “once again transform many of our core institutions and industries.”

In the 10 years since then, Zuckerberg did exactly what he said he would do. He did rewire the way we spread and consume information; he did transform our institutions, and he pushed us past the tipping point. It has not worked out as he expected…

Social media and society: “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” from @JonHaidt in @TheAtlantic. Eminently worth reading in full.

See also: “The big idea: how to win the fight against disinformation.”

* Clay Shirky

###

As we follow Jaron Lanier’s advice to “go to where you are kindest,” we might recall that it was on this date in 1397 that Geoffrey Chaucer “told” (read aloud) The Canterbury Tales for the first time at the court of Richard II.

A woodcut from William Caxton‘s second edition of The Canterbury Tales, printed in 1483

source

“Why has our age surrendered so easily to the controllers, the manipulators, the conditioners of an authoritarian technics?”*…

Half a century ago, Lewis Mumford developed a concept that explains why we trade autonomy for convenience…

… Surveying the state of the high-tech life, it is tempting to ponder how it got so bad, while simultaneously forgetting what it was that initially convinced one to hastily click “I agree” on the terms of service. Before certain social media platforms became foul-smelling swamps of conspiratorial misinformation, many of us joined them for what seemed like good reasons; before sighing at the speed with which their batteries die, smartphone owners were once awed by these devices; before grumbling that there was nothing worth watching, viewers were astounded by how much streaming content was available at one’s fingertips. Overwhelmed by the way today’s tech seems to be burying us in the bad, it’s easy to forget the extent to which tech won us over by offering us a share in the good — or to be more precise, in “the goods.”

Nearly 50 years ago, long before smartphones and social media, the social critic Lewis Mumford put a name to the way that complex technological systems offer a share in their benefits in exchange for compliance. He called it a “bribe.” With this label, Mumford sought to acknowledge the genuine plentitude that technological systems make available to many people, while emphasizing that this is not an offer of a gift but of a deal. Surrender to the power of complex technological systems — allow them to oversee, track, quantify, guide, manipulate, grade, nudge, and surveil you — and the system will offer you back an appealing share in its spoils. What is good for the growth of the technological system is presented as also being good for the individual, and as proof of this, here is something new and shiny. Sure, that shiny new thing is keeping tabs on you (and feeding all of that information back to the larger technological system), but it also lets you do things you genuinely could not do before. For a bribe to be accepted it needs to promise something truly enticing, and Mumford, in his essay “Authoritarian and Democratic Technics,” acknowledged that “the bargain we are being asked to ratify takes the form of a magnificent bribe.” The danger, however, was that “once one opts for the system no further choice remains.” 

For Mumford, the bribe was not primarily about getting people into the habit of buying new gadgets and machines. Rather it was about incorporating people into a world that complex technological systems were remaking in their own image. Anticipating resistance, the bribe meets people not with the boot heel, but with the gift subscription.

The bribe is a discomforting concept. It asks us to consider the ways the things we purchase wind up buying us off, it asks us to see how taking that first bribe makes it easier to take the next one, and, even as it pushes us to reflect on our own complicity, it reminds us of the ways technological systems eliminate their alternatives. Writing about the bribe decades ago, Mumford was trying to sound the alarm, as he put it: “This is not a prediction of what will happen, but a warning against what may happen.” As with all of his glum predictions, it was one that Mumford hoped to be proven wrong about. Yet as one scrolls between reviews of the latest smartphone, revelations about the latest misdeeds of some massive tech company, and commentary about the way we have become so reliant on these systems that we cannot seriously speak about simply turning them off — it seems clear that what Mumford warned “may happen” has indeed happened…

Eminently worth reading in full: “The Magnificent Bribe,” by Zachary Loeb in @_reallifemag.

As to (some of) the modern implications of that bargain, see also Shoshana Zuboff’s “You Are the Object of a Secret Extraction Operation.”

As we move into the third decade of the 21st century, surveillance capitalism is the dominant economic institution of our time. In the absence of countervailing law, this system successfully mediates nearly every aspect of human engagement with digital information. The promise of the surveillance dividend now draws surveillance economics into the “normal” economy, from insurance, retail, banking and finance to agriculture, automobiles, education, health care and more. Today all apps and software, no matter how benign they appear, are designed to maximize data collection.

Historically, great concentrations of corporate power were associated with economic harms. But when human data are the raw material and predictions of human behavior are the product, then the harms are social rather than economic. The difficulty is that these novel harms are typically understood as separate, even unrelated, problems, which makes them impossible to solve. Instead, each new stage of harm creates the conditions for the next stage…

And resonantly: “AI-tocracy,” a working paper from NBER that links the development of artificial intelligence with the interests of autocracies. From the abstract:

Can frontier innovation be sustained under autocracy? We argue that innovation and autocracy can be mutually reinforcing when: (i) the new technology bolsters the autocrat’s power; and (ii) the autocrat’s demand for the technology stimulates further innovation in applications beyond those benefiting it directly. We test for such a mutually reinforcing relationship in the context of facial recognition AI in China. To do so, we gather comprehensive data on AI firms and government procurement contracts, as well as on social unrest across China during the last decade. We first show that autocrats benefit from AI: local unrest leads to greater government procurement of facial recognition AI, and increased AI procurement suppresses subsequent unrest. We then show that AI innovation benefits from autocrats’ suppression of unrest: the contracted AI firms innovate more both for the government and commercial markets. Taken together, these results suggest the possibility of sustained AI innovation under the Chinese regime: AI innovation entrenches the regime, and the regime’s investment in AI for political control stimulates further frontier innovation.

(And, Anne Applebaum warns, “The Bad Guys Are Winning.”)

* “Why has our age surrendered so easily to the controllers, the manipulators, the conditioners of an authoritarian technics? The answer to this question is both paradoxical and ironic. Present day technics differs from that of the overtly brutal, half-baked authoritarian systems of the past in one highly favorable particular: it has accepted the basic principle of democracy, that every member of society should have a share in its goods. By progressively fulfilling this part of the democratic promise, our system has achieved a hold over the whole community that threatens to wipe out every other vestige of democracy.

The bargain we are being asked to ratify takes the form of a magnificent bribe. Under the democratic-authoritarian social contract, each member of the community may claim every material advantage, every intellectual and emotional stimulus he may desire, in quantities hardly available hitherto even for a restricted minority: food, housing, swift transportation, instantaneous communication, medical care, entertainment, education. But on one condition: that one must not merely ask for nothing that the system does not provide, but likewise agree to take everything offered, duly processed and fabricated, homogenized and equalized, in the precise quantities that the system, rather than the person, requires. Once one opts for the system no further choice remains. In a word, if one surrenders one’s life at source, authoritarian technics will give back as much of it as can be mechanically graded, quantitatively multiplied, collectively manipulated and magnified.”

– Lewis Mumford in “Authoritarian and Democratic Technics,” via @LMSacasas

###

As we untangle user agreements, we might recall that it was on this date in 1970 that Douglas Engelbart (see here, here, and here) was granted a patent (US No. 3,541,541) on the “X-Y Position Indicator for a Display System,” the world’s first prototype computer mouse– a wooden block containing the tracking apparatus, with a single button attached.

source


“Curiosity has its own reason for existence”*…

This is, as nearly as I can tell, the 5,000th (Roughly) Daily post (4,505 blog posts, preceded by 495 email-only pieces). On this numerologically-significant occasion, my deep thanks to readers past and present. It seems appropriate to devote this post to the impulse that has powered (Roughly) Daily from the start, curiosity– free-range curiosity…

Recently I read a terrific blog post by CJ Eller where he talks about the value of paying attention to offbeat things.

Eller was joining an online conversation about how people get caught up in the “status and celebrity game” when they’re trying to grow their audience. They become overly obsessed with following — and emulating, and envying — the content of people with massive audiences. The conversation started with this poignant essay by the author Ali Montag; she concludes that rabidly chasing followers endows your writing (and thinking!) with “inescapable mediocrity.” (It tends to make you miserable, too, she points out)…

Instead of crowding your attention with what’s already going viral on the intertubes, focus on the weird stuff. Hunt down the idiosyncratic posts and videos that people are publishing, oftentimes to tiny and niche audiences. It’s decidedly unviral culture — but it’s more likely to plant in your mind the seed of a rare, new idea.

I love the idea of “rewilding your attention”. It puts a name on something I’ve been trying to do for a while now: To stop clicking on the stuff big-tech algorithms push at me… social behavior can influence our attention: What are the high-follower-count folks talking/posting/arguing about today? This isn’t always a bad thing. We’re social animals, so we’re necessarily (and often productively) intrigued by what others are chewing over. But as these three writers note, it’s also crucial to follow your own signal — to cultivate the stuff you’re obsessed with, even if few others are.

On top of the social pressure from people online, there’s technological pressure too — from recommendation systems trying to juke our attention… Medium’s algorithm has deduced that … I’m a nerd. They are correct! I am. The other major social networks, like Twitter or YouTube, offer me the same geek-heavy recommendations when I log in. And hey, they’re not wrong either; I really do like these subjects.

But … I’m also interested in so many other things that are far outside these narrow lanes. I am, for example, a Canadian who’s deeply into Canadian art, and a musician who spends a lot of time thinking about composition and gear and lyric-writing and production and guitar pedals, and a father who thinks a lot about the culture my kids show me, and I have a super-snobby fanboy love of the 18th century poet Alexander Pope.

You’re the same way; you contain your own Whitmanian multitudes, your pockets of woolly-eyed obsession. We all do.

But our truly quirky dimensions are never really grasped by these recommendation algorithms. They have all the dullness of a Demographics 101 curriculum; they sketch our personalities with the crudity of crime-scene chalk-outlines. They’re not wrong about us; but they’re woefully incomplete. This is why I always get a slightly flattened feeling when I behold my feed, robotically unloading boxes of content from the same monotonous conveyor-belt of recommendations, catered to some imaginary marketing version of my identity. It’s like checking my reflection in the mirror and seeing stock-photo imagery.

The other problem with big-tech recommendation systems is they’re designed by people who are convinced that “popularity” and “recency” equal “valuable”. They figure that if they sample the last 15 milliseconds of the global zeitgeist and identify what’s floated to the top of that quantum foam, I’ll care about it. Hey, a thing happened and people are talking about it, here’s the #hashtag!

And again … they’re sometimes right! I am often intrigued to know the big debates of the day, like Oscar Wilde peering into his daily gazette. But I’d also like to stumble over arguments yet more arcane, and material that will never be the subject of a massive online conversation because only a small group of oddballs care about it.

You’re the same way too, I bet. We’re all weird in different ways, but we’re all weird.

Big-tech recommendation systems have been critiqued lately for their manifold sins — i.e., how their remorseless lust for “engagement” leads them to overpromote hotly emotional posts; how they rile people up; how they feed us clicktastic disinfo; how they facilitate “doomscrolling”. All true.

But they pose a subtler challenge, too, for our imaginative lives: their remarkably dull conception of what’s interesting. It’s like intellectual monocropping. You open your algorithmic feed and see rows and rows of neatly planted corn, and nothing else.

That’s why I so enjoy the concept of “rewilding”… For me, it’s meant slowly — over the last few years — building up a big, rangy collection of RSS feeds that let me check up on hundreds of eclectic blogs and publications and people. (I use Feedly.) I’ve also started using Fraidycat, a niftily quixotic feed-reader that lets you sort sources into buckets by “how often should I check this source”, which is a cool heuristic; some people/sites you want to check every day, and others, twice a year.

Other times I spend an hour or two simply prospecting — I pick a subject almost at random, then check to see if there’s a hobbyist or interest-group discussion-board devoted to it. (There usually is, running on free warez like phpBB). Then I’ll just trawl through the forum, to find out what does this community care about? It’s like a psychogeographic walk of the mind.

Another awesome technology for rewilding my attention, I’ve found, is the good old-fashioned paper book. I go to a bookstore, pick up something where it’s not immediately obvious why it’d appeal to me, then flip around to see if anything catches my eye. (This works online, too, via the wonderful universe of pre-1923, freely-accessible ebooks and publications at the Internet Archive, Project Gutenberg, or even Google Books. Pre-WWI material is often super odd and thought-provoking.)…
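Thompson’s favorite Fraidycat trick, sorting sources into “how often should I check this?” buckets, is also easy to sketch in code. Here is a hypothetical Python example built on the feedparser library; the feed URLs and intervals are invented, and Fraidycat itself is a separate, more capable tool.

```python
# A minimal sketch of Fraidycat's "how often should I check this?"
# heuristic, assuming the third-party feedparser library
# (pip install feedparser). Feed URLs and intervals are invented.
import time
import feedparser

# source URL -> minimum seconds between checks (daily, weekly, yearly)
FEEDS = {
    "https://example.com/daily-blog/rss":    24 * 3600,
    "https://example.com/slow-essays/rss":   7 * 24 * 3600,
    "https://example.com/annual-report/rss": 365 * 24 * 3600,
}

last_checked: dict[str, float] = {}

def due_feeds(now=None):
    """Yield only the feeds whose check interval has elapsed."""
    now = now or time.time()
    for url, interval in FEEDS.items():
        if now - last_checked.get(url, 0.0) >= interval:
            yield url

def poll():
    """Fetch due feeds and print their newest entry titles."""
    for url in due_feeds():
        feed = feedparser.parse(url)
        last_checked[url] = time.time()
        for entry in feed.entries[:3]:
            print(f"{feed.feed.get('title', url)}: {entry.title}")

if __name__ == "__main__":
    poll()
```

The interval map is the whole point: a source you check twice a year is still part of your attention diet, which is exactly the kind of slow, quiet signal an engagement-ranked feed never surfaces.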

Step away from algorithmic feeds. In praise of free-range curiosity: “Rewilding your attention,” from Clive Thompson (@pomeranian99).

See also “Before Truth: Curiosity, Negative Capability, Humility,” from Will Wilkinson (@willwilkinson)

* “Curiosity has its own reason for existence. One cannot help but be in awe when he contemplates the mysteries of eternity, of life, of the marvelous structure of reality. It is enough if one tries merely to comprehend a little of this mystery each day.” — Albert Einstein, “Old Man’s Advice to Youth: ‘Never Lose a Holy Curiosity.’” LIFE Magazine (2 May 1955), p. 64

###

As we revel in rabbit holes, we might send insightfully-humorous birthday greetings to William Penn Adair Rogers; he was born on this date in 1879.  A stage and motion picture actor, vaudeville performer, cowboy, humorist, newspaper columnist, and social commentator, he traveled around the world three times, made 71 films (50 silent films and 21 “talkies”), and wrote more than 4,000 nationally syndicated newspaper columns.  By the mid-1930s Rogers was hugely popular in the United States, its leading political wit and the highest paid Hollywood film star.  He died in 1935 with aviator Wiley Post when their small airplane crashed in northern Alaska.

Known as “Oklahoma’s Favorite Son,” Rogers was a Cherokee citizen, born to a Cherokee family in Indian Territory (now part of Oklahoma).

“I am not a member of an organized political party. I am a Democrat.” – Will Rogers


source
