(Roughly) Daily

Posts Tagged ‘systems’

“Life is really simple, but we insist on making it complicated”*…

One of the dominant themes of the last few years is that nothing makes sense. Donald Trump is president, QAnon has mainstreamed fringe conspiracy theories, and hundreds of thousands are dead from a pandemic and climate change while many Americans do not believe that either the pandemic or climate change is deadly. It’s incomprehensible.

I am here to tell you that the reason so much of the world seems incomprehensible is that it is incomprehensible. From social media to the global economy to supply chains, our lives rest precariously on systems that have become so complex, and we have yielded so much of them to automated technologies and autonomous actors, that no one totally comprehends it all.

In other words: No one’s driving. And if we hope to retake the wheel, we’re going to have to understand, intimately, all of the ways we’ve lost control…

The internet might be the system that we interact with in the most direct and intimate ways, but most of us have little comprehension of what lies behind our finger-smudged touchscreens. Made up of data centers, internet exchanges, huge corporations, tiny startups, investors, social media platforms, datasets, adtech companies, and billions of users and their connected devices, it’s a vast network dedicated to mining, creating, and moving data on scales we can’t comprehend. YouTube users upload more than 500 hours of video every minute — which works out to 82.2 years of video uploaded to YouTube every day. As of June 30, 2020, there are over 2.7 billion monthly active Facebook users, with 1.79 billion people on average logging on daily. Each day, 500 million tweets are sent — or 6,000 tweets every second, with a day’s worth of tweets filling a 10-million-page book. Every day, 65 billion messages are sent on WhatsApp. By 2025, it’s estimated that 463 million terabytes of data will be created each day — the equivalent of 212,765,957 DVDs…
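The arithmetic behind a couple of those figures is easy to verify; here is a minimal back-of-the-envelope check (the constants are simply the ones quoted above):

```python
# Sanity check of the upload and tweet figures quoted above.
HOURS_UPLOADED_PER_MINUTE = 500
MINUTES_PER_DAY = 24 * 60

hours_per_day = HOURS_UPLOADED_PER_MINUTE * MINUTES_PER_DAY   # 720,000 hours of video per day
years_per_day = hours_per_day / 24 / 365                      # roughly 82.2 years of video per day

TWEETS_PER_DAY = 500_000_000
tweets_per_second = TWEETS_PER_DAY / (24 * 60 * 60)           # roughly 5,800, i.e. about 6,000 per second

print(f"{years_per_day:.1f} years of video per day; {tweets_per_second:,.0f} tweets per second")
```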

What we’ve ended up with is a civilization built on the constant flow of physical goods, capital, and data, and the networks we’ve built to manage those flows in the most efficient ways have become so vast and complex that they’re now beyond the scale of any single human understanding them (and, arguably, beyond any group or team of humans doing so). It’s tempting to think of these networks as huge organisms, with tentacles spanning the globe that touch everything and interlink with one another, but I’m not sure the metaphor is apt. An organism suggests some form of centralized intelligence, a nervous system with a brain at its center, processing data through feedback loops and making decisions. But the reality with these networks is much closer to the concept of distributed intelligence or distributed knowledge, where many different agents with limited information beyond their immediate environment interact in ways that lead to decision-making, often without them even knowing that’s what they’re doing…

Ceding control to vast unaccountable networks not only risks those networks going off the rails, it also threatens democracy itself. If we are struggling to understand or influence anything more than very small parts of them, this is also increasingly true for politicians and world leaders. Like the captain of the container ship, politicians and voters have less and less control over how any of these networks run. Instead they find themselves merely managing very small parts of them — they certainly don’t seem to be able to make drastic changes to those networks (which are mainly owned by private corporations anyway) even though they have a very direct impact on their nations’ economies, policies, and populations. To paraphrase the filmmaker Adam Curtis, instead of electing visionary leaders, we are in fact just voting for middle managers in a complex, global system that nobody fully controls.

The result of this feels increasingly like a democratic vacuum. We live in an era where voters have record levels of distrust for politicians, partly because they can feel this disconnect — they see from everyday reality that, despite their claims, politicians can’t effect change. Not really. They might not understand why, exactly, but there’s this increasing sense that leaders have lost the ability to make fundamental changes to our economic and social realities. The result is a large body of mainstream voters that wants to burn down the status quo. They want change, but don’t see politicians being able to deliver it. It feels like they’re trapped in a car accelerating at full throttle, but no one is driving.

They may not be able to do much about it, but there are mainstream politicians and elected leaders who see this vacuum for what it is — and see how it provides them with a political opportunity. Figures like Donald Trump and Boris Johnson certainly don’t believe in patching up the failures of this system — if anything, they believe in accelerating the process, deregulating, handing more power to the networks. No, for them this is a political vacuum that can be filled with blame. With finger-pointing and scapegoating. It is an opportunity to make themselves look powerful by pandering to fears, by evoking nationalism, racism, and fascism.

Donald Trump has still not conceded the 2020 election despite Joe Biden’s clear victory, and is leaning in part on the fact that the United States has a complex and sometimes opaque voting system, one that most of the public doesn’t understand, to spread conspiracy theories about glitchy or malfeasant voting machines switching or deleting millions of votes. It’s perhaps no coincidence that some of the highest-profile figures on the right — like ex-Trump adviser Steve Bannon or Brexit Party leader Nigel Farage — have backgrounds in the financial industry. These are political players who have seen how complicated things have become and can sense the gap in public comprehension, but want to fill it with chaos and conspiracies rather than explanations…

As Tim Maughan (@TimMaughan) explains, vast systems, from automated supply chains to high-frequency trading, now undergird our daily lives — and we’re losing control of all of them: “The Modern World Has Finally Become Too Complex for Any of Us to Understand” (the first of a series of monthly columns that will “locate ways that we can try to increase our knowledge of the seemingly unknowable, as well as find strategies to counter the powerlessness and anxiety the system produces”).

* Confucius

###

As we contemplate complexity, we might send emergent birthday greetings to Per Bak; he was born on this date in 1948. A theoretical physicist, he is credited with developing the concept (and coining the name) of “self-organized criticality,” an explanation of how very complex phenomena (like consciousness) emerge from the interaction of simple components.
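Bak’s standard illustration of self-organized criticality is the sandpile model. Here is a toy version (my own illustrative sketch, not Bak’s original formulation): grains are dropped one at a time, any site holding four or more grains topples onto its neighbors, and the resulting avalanches span many sizes even though every rule is local and simple.

```python
import random

random.seed(1)

# Toy Bak-Tang-Wiesenfeld-style sandpile on a small grid: drop grains one at a
# time; any site holding 4 or more grains topples, sending one grain to each
# neighbour (grains that fall off the edge are lost).
N = 20
grid = [[0] * N for _ in range(N)]

def relax(x, y):
    """Topple until the pile is stable again; return the avalanche size."""
    size = 0
    unstable = [(x, y)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        size += 1
        if grid[i][j] >= 4:          # still over threshold, topple again later
            unstable.append((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return size

avalanches = []
for _ in range(50_000):
    x, y = random.randrange(N), random.randrange(N)
    grid[x][y] += 1
    if grid[x][y] >= 4:
        avalanches.append(relax(x, y))

# Most avalanches are tiny; a handful are enormous. That heavy tail is the
# signature of self-organized criticality.
print("largest avalanche:", max(avalanches))
print("median avalanche :", sorted(avalanches)[len(avalanches) // 2])
```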

source

“Fools ignore complexity. Pragmatists suffer it… Geniuses remove it.”*…

 

World War II bomber planes returned from their missions riddled with bullet holes. The first response was, not surprisingly, to add armor to those areas most heavily damaged. However, the statistician Abraham Wald made what seemed like the counterintuitive recommendation to add armor to those parts with no damage. Wald had uniquely understood that the planes that had been shot where no bullet holes were seen were the planes that never made it back. That’s, of course, where the real problem was. Armor was added to the seemingly undamaged places, and losses decreased dramatically.
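The logic is easy to make concrete with a toy simulation (the sections, hit counts, and lethality numbers below are invented for illustration, not taken from Wald’s data): if hits to one part of the plane are usually fatal, the planes that return will show almost no damage there, precisely because the damaged ones never came back.

```python
import random

random.seed(0)

SECTIONS = ["fuselage", "wings", "engine"]
# Hypothetical probability that a single hit to each section downs the plane.
LETHALITY = {"fuselage": 0.1, "wings": 0.1, "engine": 0.8}

visible_damage = {s: 0 for s in SECTIONS}   # hits observed on planes that returned

for _ in range(10_000):
    hits = [random.choice(SECTIONS) for _ in range(5)]   # each sortie takes 5 random hits
    survived = all(random.random() > LETHALITY[s] for s in hits)
    if survived:
        for s in hits:
            visible_damage[s] += 1

# The returning planes show plenty of fuselage and wing damage and almost no
# engine damage: not because engines are rarely hit, but because planes hit
# in the engine rarely make it home.
print(visible_damage)
```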

The visible bullet holes of this pandemic are the virus and its transmission. Understandably, a near-universal response to the COVID-19 pandemic has been to double down on those disciplines where we already possess deep and powerful knowledge: immunology and epidemiology. Massive resources have been directed at combating the virus by providing fast grants for disciplinary work on vaccines. Federal agencies have called for even more rapid response from the scientific community. This is a natural reaction to the immediate short-term crisis.

The damage we are not attending to is the deeper nature of the crisis—the collapse of multiple coupled complex systems.

Societies the world over are experiencing what might be called the first complexity crisis in history. We should not have been surprised that a random mutation of a virus in a far-off city in China could lead in just a few short months to the crash of financial markets worldwide, the end of football in Spain, a shortage of flour in the United Kingdom, the bankruptcy of Hertz and Neiman Marcus in the United States, the collapse of travel, and to so much more.

As scientists who study complex systems, we conceive of a complexity crisis as a twofold event. First, it is the failure of multiple coupled systems—our physical bodies, cities, societies, economies, and ecosystems. Second, it involves solutions, such as social distancing, that come with unavoidable tradeoffs, some of which amplify the primary failures. In other words, the way we respond to failing systems can accelerate their decline.

We and our colleagues in the Santa Fe Institute Transmission Project believe there are some non-obvious insights and solutions to this crisis that can be gleaned from studying complex systems and their universal properties…

The more complicated and efficient a system gets, the more likely it is to collapse altogether.  Scientists who study complex systems offer solutions to the pandemic: “The Damage We’re Not Attending To.”

See also: “Complex Systems Theory Explains Why Covid Crushed the World.”

* Alan Perlis

###

As we think systemically, we might recall that it was on this date in 1835 that the New York Sun began a series of six articles detailing the discovery of civilized life on the moon.  Now known as “The Great Moon Hoax,” the articles attributed the “discovery” to Sir John Herschel, the greatest living astronomer of the day.  Herschel was initially amused, wryly noting that his own real observations could never be as exciting.  But ultimately he tired of having to answer questioners who believed the story.  The series was not discovered to be a hoax for several weeks after its publication and, even then, the newspaper did not issue a retraction.

The “ruby amphitheater” on the Moon, per the New York Sun (source)

 

“One ingredient of many fiascos is that great, massive, heart-wrenching chaos and failure are more likely to occur when great ambitions come into play”*…

 

There’s a term for when a single hiccup triggers a chain reaction that makes everything go absolutely, altogether, totally, and undeniably wrong, causing a large and intricate system to collapse on itself. On the street you might call it a fiasco—but in more formal parlance it’s called “cascading failure.”

Sound familiar? If you’ve been through an electrical grid outage, there’s a good chance you’ve heard it in that context. It’s not a new phenomenon, but it’s a relatively recent term, and the complexity of modern life has multiplied the real-life scenarios for its use in the fields of technology, biology, and finance. The easiest way to think about cascading failure is as a line of tumbling dominoes—or the plot of Jurassic Park, a blockbuster about how the smallest of errors can lead to total catastrophe…
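The domino image can be made a little more precise with a toy load-redistribution model (a hypothetical sketch, not something from the linked article): when one component fails, its load is pushed onto the survivors, and if they are already running near capacity the single failure takes down the whole system.

```python
def cascade(loads, capacity, start):
    """Toy cascading-failure model: a failed node's load is split among the survivors."""
    failed = {start}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        survivors = [i for i in range(len(loads)) if i not in failed]
        if not survivors:
            break
        share = loads[node] / len(survivors)   # redistribute the failed node's load
        for i in survivors:
            loads[i] += share
            if loads[i] > capacity:
                failed.add(i)
                frontier.append(i)
    return failed

# Lightly loaded network: the initial failure stays contained.
print(cascade(loads=[5, 5, 5, 5, 5], capacity=10, start=0))   # {0}
# The same network running near capacity: one failure takes everything down.
print(cascade(loads=[9, 9, 9, 9, 9], capacity=10, start=0))   # {0, 1, 2, 3, 4}
```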

Epic power blackouts, the “flash crash,” coronavirus response, and so much more: “Cascading Failure.”

* Ira Glass, in his introduction to “Opening Night,” Act One of the This American Life episode “Fiasco”… and the funniest 21 minutes of radio your correspondent has ever heard.

###

As we consider causation, we might recall that it was on this date in 1986, at 11:10 p.m., that operators at the Chernobyl Nuclear Power Plant in Ukraine received the go-ahead to commence a safety test that was scheduled to coincide with a routine shutdown for maintenance.  Just over two hours later, an unexpected power surge triggered what we now know as “The Chernobyl disaster”– considered, even after Fukushima, the worst nuclear catastrophe in history.  It killed 31 people directly, including 28 workers and firefighters who died of acute radiation poisoning during the cleanup.  Experts believe it likewise caused thousands of premature cancer deaths, though the exact number is disputed.  To this day, the area around the plant remains so contaminated that it’s officially closed to human habitation.


A view of the facility three days after the incident

source

 

Written by LW

April 25, 2020 at 1:01 am

“Democracy is never a final achievement. It is a call to an untiring effort.”*…

 


Diagram of U.S. governance, 1862 [source and larger version]

 

The Roman Empire, the Iroquois Confederacy, and the United States of America are human inventions as surely as airplanes, computers, and contraception are. Technology is how we do things, and political institutions are how we collaborate at scale. Government is an immensely powerful innovation through which we take collective action.

Just like any other technology, governments open up new realms of opportunity. These opportunities are morally neutral: humans have leveraged political institutions to provide public education and to murder ethnic minorities. Specific features like explicit protections for human rights and civil liberties are designed to help mitigate certain downside risks.

Like any tool, systems of governance require maintenance to keep working. We expect regular software updates, but forget that governance is also in constant flux, and begins to fail when it falls out of sync with the culture. Without preventative maintenance, pressure builds like tectonic forces along a fault line until a new order snaps into place, often violently…

Widespread adoption renders technology invisible, its ubiquity revealed only when it breaks. That’s why science fiction plots so often hinge on systems breaking, and explains Wired Senior Maverick Kevin Kelly’s approach to futurism: “I’m looking for the places where technology is abused, misused, or unsupervised in order to get a glimpse of its natural inherent leanings. Where the edges go, the center follows later.”

If you’re worried about the demise of democracy because you see how the system is being abused, congratulations! You have just discovered a way to make democracy stronger. Ask any programmer: Nothing clarifies software development like a major bug report. Follow that edge. Sharpen, blunt, or redirect it as necessary. The center will follow…

Critically acclaimed novelist and essayist Eliot Peper (@eliotpeper) argues for regular maintenance and upgrades: “Government is a technology, so fix it like one.”

* John F. Kennedy

###

As we undertake an upgrade, we might spare a thought for Marcus Tullius Cicero; he died on this date in 43 BCE.  A  Roman philosopher, politician, lawyer, political theorist, consul and constitutionalist, Cicero was one of Rome’s greatest orators and prose stylists.  His influence on the Latin language was immense: it has been said that subsequent prose was either a reaction against or a return to his style, not only in Latin but also (after Petrarch’s rediscovery of Cicero’s work) in European languages up to the 19th century.

A champion of Republican government in Rome, he spoke against the second Catilinarian conspiracy, against  Julius Caesar, then– even more eloquently– against Mark Antony.  He was executed in 43 BCE (his head and hands were amputated and displayed to the public) for his Philippics, a series of speeches attacking Antony and calling (again) for a restoration of the Republic.  Sic semper prōtestor?

source

 

Written by LW

December 7, 2019 at 1:01 am

“How about a little magic?”*…

 

Once upon a time (bear with me if you’ve heard this one), there was a company which made a significant advance in artificial intelligence. Given their incredibly sophisticated new system, they started to put it to ever-wider uses, asking it to optimize their business for everything from the lofty to the mundane.

And one day, the CEO wanted to grab a paperclip to hold some papers together, and found there weren’t any in the tray by the printer. “Alice!” he cried (for Alice was the name of his machine learning lead) “Can you tell the damned AI to make sure we don’t run out of paperclips again?”…

What could possibly go wrong?

[As you’ll read in the full and fascinating article, a great deal…]

Computer scientists tell the story of the Paperclip Maximizer as a sort of cross between the Sorcerer’s Apprentice and The Matrix; a reminder of why it’s crucially important to tell your system not just what its goals are, but how it should balance those goals against costs. It frequently comes with a warning that it’s easy to forget a cost somewhere, and so you should always check your models carefully to make sure they aren’t accidentally turning into Paperclip Maximizers…
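In optimization terms, the parable is about an objective function with no cost term. A minimal, hypothetical sketch of the difference (the plans and numbers are made up for illustration, not drawn from Zunger’s piece):

```python
def naive_objective(paperclips, resources_consumed):
    # The unbalanced goal: more paperclips is always better, whatever it costs.
    return paperclips

def balanced_objective(paperclips, resources_consumed, cost_weight=5.0):
    # The same goal, with the costs the parable warns about priced in.
    return paperclips - cost_weight * resources_consumed

# Two plans the optimizer might compare (purely illustrative numbers).
modest_plan    = {"paperclips": 1_000,     "resources_consumed": 1}
maximizer_plan = {"paperclips": 1_000_000, "resources_consumed": 1_000_000}

for objective in (naive_objective, balanced_objective):
    best = max((modest_plan, maximizer_plan), key=lambda plan: objective(**plan))
    print(objective.__name__, "prefers", best)
```

The naive objective always picks the plan that consumes everything in sight; pricing in the cost is what keeps it from becoming a Paperclip Maximizer.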

But this parable is not just about computer science. Replace the paper clips in the story above with money, and you will see the rise of finance…

Yonatan Zunger tells a powerful story that’s not (only) about AI: “The Parable of the Paperclip Maximizer.”

* Mickey Mouse, The Sorcerer’s Apprentice

###

As we’re careful what we wish for (and how we wish for it), we might recall that it was on this date in 1631 that the Puritans in the recently chartered Massachusetts Bay Colony issued a General Court ordinance that banned gambling: “whatsoever that have cards, dice or tables in their houses, shall make away with them before the next court under pain of punishment.”

source

 

Written by LW

March 22, 2019 at 1:01 am
