(Roughly) Daily

Posts Tagged ‘misinformation’

“The tale is old as the Eden Tree – as new as the new-cut tooth – For each man knows ere his lip-thatch grows he is master of Art and Truth.”*…

A collage of historical documents and texts, including a colorful illuminated manuscript, a page from the original 'Hamlet' by William Shakespeare, and various handwritten notes, set against a blue background.

Mendacious politicians, duplicitous corporations, AI slop– it’s getting harder and harder to find authenticity, to get to the truth. Further to our occasional posts on misinformation in history, a look at Johns Hopkins University’s Bibliotheca Fictiva Collection of Literary and Historical Forgery, a tangible demonstration that humans have been creating fan fiction and fake news for millennia…

In “The History of Fake News From the Flood to the Apocalypse,” the course he teaches at Johns Hopkins University, Earle Havens [see here] presents undergrads with a formidable challenge: they have to create historical forgeries and then defend the authenticity of their deceptions.

Forgeries, hoaxes, and other types of literary fakery have preoccupied Havens, a rare books and manuscripts curator at the university’s Stern Center for the History of the Book, for many years now. As part of his curatorial brief, Havens oversees the Bibliotheca Fictiva Collection of Literary and Historical Forgery, available via JSTOR. It includes more than 2,000 items—rare books, manuscripts, and ephemera—and was the brainchild of Arthur and Janet Freeman, who amassed most of its holdings over a period of some fifty years. Johns Hopkins acquired the majority of the collection from the Freemans in 2011; it has continued to expand in the years since…

Sara Ivry interviews Havens: “Enchanting Imposters,” from @jstordaily.bsky.social and @saraivry.bsky.social.

* Rudyard Kipling “The Conundrum of the Workshops” (quoted by Orson Welles in his remarkable film F for Fake)

###

As we grab for a grain of salt, we might recall that it was on this date in 1964 that the Rolling Stones made their first appearance on The Ed Sullivan Show performing Chuck Berry’s “Around and Around” (and closing the show with “Time Is On My Side”).

The band’s appearance on the show generated over a million dollars in ticket sales for their fall concert tour, and despite outrage from conservative adults who disapproved of the Stones’ “unkempt” image, the group returned to The Ed Sullivan Show for another six appearances throughout the rest of the 1960s. – source

Written by (Roughly) Daily

October 25, 2025 at 1:00 am

“Truth does not do so much good in the world, as the appearance of it does evil”*…

Joshua Benton on the way that we handle misinformation as election stakes rise…

… Giving someone a meaningful incentive on a mental problem can lead them to work harder and have a better chance of getting it right. That’s also true for a very specific kind of mental problem: figuring out whether to believe some random headline you see on social media….

[Benton cites several studies that confirm this generalization…]

There’s a consistent thread here: If people don’t see a reason to bring their full mental capacity to bear on a question, they probably won’t. We’re lazy! But when the stakes are a little higher — when there’s a little more reason to bring our A-game — we can do better.

Let’s transfer that idea into politics. After all, there’s usually no direct reward for sussing out a fake headline in your News Feed, or for detecting when a claim about a politician edges from plausible to laughable. In day-to-day life, a single bit of political wrongness is unlikely to impact your life one whit. So why summon up the brain power?

But what if the stakes were suddenly higher — say, just hypothetically, if it were a presidential election season and the country were being presented with two wildly different potential futures? Would people then summon up more of their mental capacity to separate good information from bad? Pundits have long said most voters only “get serious” about an election a few weeks before the big day — maybe that new seriousness might mean a stricter adherence to the facts?

That’s one of the issues addressed by a new paper by Charles Angelucci, Michel Gutmann, and Andrea Prat — of MIT, Northwestern, and Columbia, respectively. Its title is “Beliefs About Political News in the Run-up to an Election”; here’s the abstract, emphasis mine:

This paper develops a model of news discernment to explore the influence of elections on the formation of partisan-driven parallel information universes. Using survey data from news quizzes administered during and outside the 2020 U.S. presidential election, the model shows that partisan congruence’s impact on news discernment is substantially amplified during election periods. Outside an election, when faced with a true and a fake news story and asked to select the most likely true story, an individual is 4% more likely to choose the true story if it favors their party; in the days prior to the election, this increases to 11%.

Did you catch that? People aren’t more likely to evaluate accuracy correctly during the fever pitch of an election season — they’re less likely, and by a meaningful margin…

[Benton explains the methodology of the study and explores some of its more specific findings…]

… In a sense, it all comes down to what you mean by “high stakes.” Yes, a presidential election is high stakes for the country at large. But believing something that supports your ideological priors is high stakes for your ego — especially at the height of an all-consuming campaign. Our brains want to believe the best about our side and the worst about the other. And it seems that overrides any extra incentive for accuracy at the moment our votes matter most…

“Are people more likely to accurately evaluate misinformation when the political stakes are high? Haha, no,” from @jbenton and @NiemanLab.

* François de la Rochefoucauld, Maximes

###

As we think, we might spare a thought for Andrew Russell “Drew” Pearson; he died on this date in 1969. He was a journalist known for his long-running column, “Washington Merry-Go-Round.”

At the time of Pearson’s death (of a heart attack) in Washington, D.C., the column was syndicated to more than 650 newspapers, more than twice as many as any other, with an estimated 60 million readers, who devoured the investigative and “insider”-centered approach to political coverage that Pearson pioneered– and that has become the milieu for the misinformation discussed above. A Harris Poll commissioned by Time magazine at that time showed that Pearson was America’s best-known newspaper columnist. The column was continued by Jack Anderson and then by Douglas Cohn and Eleanor Clift, who combine commentary with historical perspectives. It is the longest-running syndicated column in America.

Pearson (left) with Lyndon Johnson (source)

“I don’t have to tell you things are bad. Everybody knows things are bad.”*…

Further to a recent post on “seeing like a system” (and in a fashion, to last Monday’s post on misinformation), a provocative essay by Rohit Krishnan

We’re living through a phase change that is at the root of a lot of our societal problems. It’s the fact that our information networks have become much more dense.

You exist as a node in a network. Other people are other nodes. They send you information along the edges. You process it, you create your own. Information flows in all directions.

There are all sorts of networks. If you imagine all of us as nodes and the information we receive from each other via edges, then the shape of the network defines the type of information that spreads.

When the type of information is extremely tantalising, one that spreads fast, the whole network gets taken over by that information. And there’s even a tipping point at which the information breaks containment and spreads through the whole network. Here’s an excellent essay on the subject…

When there are a lot of neighbours to which a node is connected, various types of information spread much faster through the network. This is called network density: how many edges are connected, on average, to each node. And increased density means that there are many more routes by which information can spread.

This is why cities have much higher rates of “cultural transmission” compared to rural areas. Or why in domains like fashion or ideas or innovation or language or even food, the speed of change and variety is much higher in cities. Because each new unit of culture can transmit from person to person so much faster when there are more people it can connect to.
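The density effect Krishnan describes can be sketched in a few lines of Python. This is a minimal toy model (the random network and the round-by-round sharing rule are assumptions of mine, not the essay’s), but it shows how a higher average degree lets the same piece of information saturate a network in far fewer hops:

```python
import random

def make_network(n, avg_degree, seed=0):
    """Build a random undirected network with roughly the given average degree."""
    rng = random.Random(seed)
    neighbors = {i: set() for i in range(n)}
    edges, edges_needed = 0, n * avg_degree // 2
    while edges < edges_needed:
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b and b not in neighbors[a]:
            neighbors[a].add(b)
            neighbors[b].add(a)
            edges += 1
    return neighbors

def rounds_to_saturate(neighbors, start=None):
    """Rounds of person-to-person sharing until the news can reach no one new."""
    if start is None:  # seed the rumor at the best-connected node
        start = max(neighbors, key=lambda i: len(neighbors[i]))
    informed, frontier, rounds = {start}, {start}, 0
    while frontier:
        frontier = {nb for node in frontier for nb in neighbors[node]} - informed
        informed |= frontier
        rounds += bool(frontier)
    return rounds, len(informed)

sparse = make_network(1000, avg_degree=4)   # a quieter, loosely connected world
dense = make_network(1000, avg_degree=40)   # everyone wired to everyone

print("sparse:", rounds_to_saturate(sparse))  # more rounds; some nodes never reached
print("dense: ", rounds_to_saturate(dense))   # fewer rounds: more routes, faster spread
```

The same mechanism underlies the tipping point mentioned above: once each informed node reaches, on average, more than one new node per round, growth turns exponential and the story “breaks containment.”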

Ok, these are the basics of what a network is. But what happens when the entire world gets interconnected such that we’re all connected to each other much more densely? What would that change in ourselves? Our culture?

Historically, you used to only have a few sources of news or information. Things that percolated through to your network or things you read in the news. Now information comes from all sides, hungry for your attention. And your “processing power” to make sense of this information hasn’t meaningfully changed.

Imagine seeing the world from this vantage point. A blinding array of data streaming at you, standing as a node. Some from near, some from afar. You take it all in, process it, and build up a sense of the world from it, including a sense of the other people and their beliefs reflecting back, from near and far.

If you do this, as a sentient being, you can’t help but develop a world model. A sense of the world, a sense of what others think of the world, perhaps even another layer or two. Even if you develop it only to help speed up the information processing that you need to do. It will be different in structure for each of you, of course, but part of a shared consensus reality nonetheless. A sense made up of all the information that came your way, including the sense you have of all the sense of information that came other people’s way, which helps them process the information flowing their way. Creating a collective sense of what’s going on, a knowledge of the shared reality in which we live.

We call this our culture…

Culture is the digital biosphere we create for ourselves. Culture is the infosphere we all swim in. If you think of the information that we all swap with each other as water, culture is the ocean made up of it, or a distilled version of the most common or communally known parts of that ocean.

Edward Hall, in his books The Silent Language and Beyond Culture, writes about how culture is composed of the communication patterns, behaviours, and symbols that are shared amongst a group. We can think of culture as the common interconnected web that underlies the beliefs we all hold, which constantly changes and evolves as our beliefs spread.

This is especially salient because part of the culture now is filled with efforts of many to escape it. This isn’t new, of course, and has existed since Thoreau. But there is an increase in it. People try tools for thought and software to recommend things or remember things, or AI to remember everything they read and interacted with, all so that there can be a way to deal with the information avalanche.

While this makes sense for an individual, the fact that this collectively defines the information ecosystem around us also means that the problem is on the supply side. This is why there is the rise of private spaces, especially since Twitter’s demise. Hence the theory that we live in the dark forest era of the internet:

Is our universe an empty forest or a dark one? If it’s a dark forest, then only Earth is foolish enough to ping the heavens and announce its presence. The rest of the universe already knows the real reason why the forest stays dark. It’s only a matter of time before the Earth learns as well.

This is also what the internet is becoming: a dark forest.

In response to the ads, the tracking, the trolling, the hype, and other predatory behaviors, we’re retreating to our dark forests of the internet, and away from the mainstream.

The push to create private spaces, on Discord or in group chats, to truly express oneself or to let mini-ecosystems flourish: all of it is needed to make us stop sitting with our faces deep inside the information superhighway. It’s to help make the networks you’re in a bit sparser.

And my thesis is that almost everything we see that pushes against the cultural state we used to recognise is a result of this densification of our information networks. Whether it is group chats, private forums, Discord, smartphones that are not that smart, yearning for the flip phones from Motorola, various tools for thought, AI software to help you focus, better note-taking tools, the angst against media headlines, the disbelief in economic and political institutions, the underlying sense of malaise that seemingly everyone feels, the vibes.

And the way we’re interconnected changes the way we see and process information that comes our way, which changes the culture. The form of the network changes the way the network operates.

In the era of superfast many-to-many communication, the ideas that spread are the ones which can “take over” the entire spectrum. Small ideas grow, wither, die. It’s only the memetic megafauna that survive.

In a sparse network you might have pockets which retain their individuality and survive for longer. Like Galapagos syndrome for ideas. Both good and bad.

Letting your ecosystem interact with the external giant internet might mean it will die out or get outcompeted…

… The outcome of having a dense network is insidious but powerful. It means only the narratives which can go viral do go viral. The collective epistemic commons becomes filled with those narratives which outcompete the others and muscle their way to the top. It means that at a time of unprecedented low unemployment, high wages, high standard of living, GDP growth, high stock markets, strong dollar, people in the US still think they’re living in the worst of all possible times. An anti-Panglossian sentiment.

It means that everyone is convinced everyone else is having a bad time. We’re surrounded by vibes about everyone’s life getting worse, wars and famine and pestilence and injustice worldwide. We see gaffes and mistakes made by everyone laid bare instantly.

The denser network makes the impact of the message change. It molds itself to the medium.

At some level of interconnectivity, we all fall prey to the weaknesses of information deluge. Our attention is finite and so is our processing capacity for information. You can have the world do a denial of service attack on your cognition by overwhelming it with bits of information, so you’re stuck in place like a fly in amber. And it does this so easily that we haven’t even recognised when it happens let alone how to prevent it.

That too is why we have so many tools for thought, and ways to capture notes to search them afterwards, and tools for doing work about work, and endless lists and notes and contextual reminders and and and … It’s why we yearn for cultural islanding. It’s why there’s the never-ending “current thing”. We’re all left tilting at the windmill of being a node in a dense network…

Dense networks, dark forests: “Seeing Like a Network,” from @krishnanrohit.

(Image above: source)

* “Howard Beale” (Peter Finch), Network

###

As we deal with density, we might recall that it was on this date in 1974, at 8:01 a.m., that the bar code made its retail debut: the first UPC-coded item, a ten-pack of Wrigley’s chewing gum, was scanned at the Marsh supermarket in Troy, Ohio by cashier Sharon Buchanan for customer Clyde Dawson.

Bar codes are now, of course, ubiquitous in retail, but are also widely used in healthcare, transport and logistics, mail, parcel, and baggage handling, and ticketing.

(source)

Written by (Roughly) Daily

June 26, 2024 at 1:00 am

“What shall we do if we take ignorance to be knowledge?”*…

A.W. Ohlheiser on a new book, Invisible Rulers— an investigation of the ways in which technology has transformed power and influence– by Renée DiResta, a leader of the late, lamented Stanford Internet Observatory

… The book examines and contextualizes how bad information and “bespoke realities” became so powerful and prominent online. She charts how the “collision of the rumor mill and the propaganda machine” on social media helped to form a trinity of influencer, algorithm, and crowd that work symbiotically to catapult pseudo-events, Twitter Main Characters, and conspiracy theories that have captured attention and shattered consensus and trust. 

DiResta’s book is part history, part analysis, and part memoir, as it spans from pre-internet examinations of the psychology of rumor and propaganda to the biggest moments of online conspiracy and harassment from the social media era. In the end, DiResta applies what she’s learned in a decade of closely researching online disinformation, manipulation, and abuse, to her personal experience of being the target of a series of baseless accusations that, despite their lack of evidence, prompted Rep. Jim Jordan, as chair of the House subcommittee on Weaponization of the Federal Government, to launch an investigation

There’s a really understandable instinct that, I think, a lot of people have when they read about online misinformation or disinformation: They want to know why it’s happening and who is to blame, and they want that answer to be easy. Hence, meme-ified arguments about “Russian bots” causing Trump to win the presidential election in 2016. Or, perhaps, pushes to deplatform one person who went viral by saying something wrong and harmful. Or the belief that we can content-moderate our way out of online harms altogether.  

DiResta’s book explains why these approaches will always fall short. Blaming the “algorithm” for a dangerous viral trend might feel satisfying, but the algorithm has never worked without human choice. As DiResta writes, “virality is a collective behavior.” Algorithms can surface and nudge and entangle, but they need user data to do it effectively…

…So, what would work?

DiResta’s ideas for this echo conversations that have been happening among misinformation experts for some time. There are some things platforms absolutely should be doing from a moderation standpoint, like removing automated trending topics, introducing friction to engaging with some online content, and generally giving users more control over what they see in their feeds and from their communities. DiResta also notes the importance of education and prebunking, which is a more preventative version of addressing false information that focuses on the tactics and tropes of online manipulation. Also, transparency…  

Misinformation: “Why lying on the internet keeps working,” @abbyohlheiser in @voxdotcom.

DiResta on her experience of harassment while at the Internet Observatory: “My Encounter with the Fantasy-Industrial Complex” (gift article)

For more on prebunking, see “Fact or Fake? The role of knowledge neglect in misinformation” (source of the image above).

Also apposite: “Is social media fueling political polarization?”

On the other hand, there’s this consideration of misinformation in the larger epistemological context of all of the information available to us: “How Dangerous is Misinformation?“: “The problem with alarmism about “misinformation” is not that it is too pessimistic about the state of media and public discourse. The problem is that it is not pessimistic enough.” Caveat lector.

* Neil Postman, Amusing Ourselves to Death

###

As we steel ourselves, we might recall that it was on this date in 1947 that Kenneth Arnold reported a UFO sighting; Arnold claimed that he saw a string of nine, shiny unidentified flying objects flying past Mount Rainier (in Washington State) at speeds that he estimated at a minimum of 1,200 miles an hour.

His was the first post-World War II sighting in the United States that attracted nationwide news coverage and is credited with being the first of the modern era of UFO sightings– including numerous reported sightings over the next two to three weeks. Arnold’s description of the objects led the press quickly to coin the terms flying saucer and flying disc as popular descriptive terms for UFOs.

After the 1947 UFO sighting, Arnold became famous “practically overnight.” Arnold’s daughter would later recall the family receiving 10,000 letters and constant phone calls. In the 1960s Arnold entered politics, running as a Republican for Lieutenant Governor of Idaho. He lost to the Democratic incumbent.

source

“The most effective way to destroy people is to deny and obliterate their own understanding of their history”*…

… and of their present. Anne Applebaum explores the ways in which autocrats in China, Russia, and elsewhere are now making common cause with MAGA Republicans to discredit liberalism and freedom around the world…

… Even in a state where surveillance is almost total, the experience of tyranny and injustice can radicalize people. Anger at arbitrary power will always lead someone to start thinking about another system, a better way to run society. The strength of these demonstrations, and the broader anger they reflected, was enough to spook the Chinese Communist Party into lifting the quarantine and allowing the virus to spread. The deaths that resulted were preferable to public anger and protest.

Like the demonstrations against President Vladimir Putin in Russia that began in 2011, the 2014 street protests in Venezuela, and the 2019 Hong Kong protests, the 2022 protests in China help explain something else: why autocratic regimes have slowly turned their repressive mechanisms outward, into the democratic world. If people are naturally drawn to the image of human rights, to the language of democracy, to the dream of freedom, then those concepts have to be poisoned. That requires more than surveillance, more than close observation of the population, more than a political system that defends against liberal ideas. It also requires an offensive plan: a narrative that damages both the idea of democracy everywhere in the world and the tools to deliver it…

… the story of how Africans—as well as Latin Americans, Asians, and indeed many Europeans and Americans—have come to spout Russian propaganda about Ukraine is not primarily a story of European colonial history, Western policy, or the Cold War. Rather, it involves China’s systematic efforts to buy or influence both popular and elite audiences around the world; carefully curated Russian propaganda campaigns, some open, some clandestine, some amplified by the American and European far right; and other autocracies using their own networks to promote the same language…

…the convergence of what had been disparate authoritarian influence projects is still new. Russian information-laundering and Chinese propaganda have long had different goals. Chinese propagandists mostly stayed out of the democratic world’s politics, except to promote Chinese achievements, Chinese economic success, and Chinese narratives about Tibet or Hong Kong. Their efforts in Africa and Latin America tended to feature dull, unwatchable announcements of investments and state visits. Russian efforts were more aggressive—sometimes in conjunction with the far right or the far left in the democratic world—and aimed to distort debates and elections in the United States, the United Kingdom, Germany, France, and elsewhere. Still, they often seemed unfocused, as if computer hackers were throwing spaghetti at the wall, just to see which crazy story might stick. Venezuela and Iran were fringe players, not real sources of influence.

Slowly, though, these autocracies have come together, not around particular stories, but around a set of ideas, or rather in opposition to a set of ideas. Transparency, for example. And rule of law. And democracy. They have heard language about those ideas—which originate in the democratic world—coming from their own dissidents, and have concluded that they are dangerous to their regimes…

The origins and the operations of today’s all-too-successful authoritarian disinformation efforts: “The New Propaganda War” (gift article) from @anneapplebaum in @TheAtlantic. Eminently worth reading in full.

Apposite: “‘Everyone is absolutely terrified’: Inside a US ally’s secret war on its American critics,” @zackbeauchamp on India’s campaign to threaten and discredit critics of the Modi regime, in @voxdotcom. Plus: “India’s YouTubers take on Narendra Modi” (gift link to @TheEconomist).

* George Orwell

###

As we analyze agitprop, we might recall that it was on this date in 1998 that Michael Fortier was sentenced to 12 years in prison and fined $200,000 for failing to warn authorities about the plot to bomb the Alfred P. Murrah Federal Building in Oklahoma City.

Carried out by right-wing (white supremacist- and militia-sympathizing) anti-government extremists Timothy McVeigh and Terry Nichols, the bombing (on April 19, 1995, at 9:02 AM) killed 168 people, injured 680, and destroyed more than one-third of the building, which had to be demolished. The blast destroyed or damaged 324 other buildings and caused an estimated $652 million worth of damage. It was the deadliest act of terrorism in U.S. history before the September 11 attacks in 2001, and still the deadliest act of domestic terrorism in U.S. history.

McVeigh had shared his plans with Fortier (his Army roommate); Fortier had accompanied McVeigh on a scouting trip to the building in advance of the blast; and Fortier had failed to warn officials of the attack.

The Alfred P. Murrah Federal Building two days after the bombing, viewed from across the adjacent parking lot (source)