(Roughly) Daily

Posts Tagged ‘knowledge’

“There is nothing new except what has been forgotten”*…

 


 

Many galaxies would fly apart if they contained only as much mass as estimates based on their visible signature suggest. Although some have posited alternative theories of gravitation to explain this discrepancy, most physicists now hypothesize the existence of mass-bearing particles that are not detectable through emitted radiation such as visible light. We call these particles dark matter, and it is estimated to compose about 85% of all matter in the observable universe.

In analyzing the functional institutions of our society, we are not able to see for ourselves most of the knowledge that created them. Knowledge of this sort includes trade secrets, tacit technical knowledge, private social networks, private intelligence-gathering operations, management and persuasive skill, cooperation and collusion among founders and their allies, and founders’ long-term plans for their institutions.¹

This knowledge has profound effects on the social landscape. We must understand it if we hope to understand society. We therefore must examine intellectual dark matter: knowledge we cannot see publicly, but whose existence we can infer because our institutions would fly apart if the knowledge we see were all there was.² Such intellectual dark matter rests at the foundations of our society, dwarfing in scope and importance the accessible, shareable, visible knowledge on which we normally focus.

There are many forms of intellectual dark matter, but the three principal ones are lost, proprietary, and tacit knowledge…

Knowledge that we can show exists, but cannot directly access, rests at the foundations of society and technology.  Samo Burja explains how– and why it matters: “Intellectual Dark Matter.”

* Marie Antoinette

###

As we contemplate comprehension, we might recall that it was on this date in 1587 that a group led by John White established the Roanoke Colony in what is now Dare County, North Carolina.  It was in fact the second colony there:  in 1585, 107 men had been left to establish a presence on Roanoke Island.  White and his crew were actually sailing to Chesapeake Bay, but stopped to check on the Roanoke group.  When they arrived, they found no one.  The master pilot of the expedition insisted that White and his crew of 115 men and women (re-)found the colony at Roanoke.

After years of difficulty, the group persuaded White to return to England to ask for help.  He did, but was delayed in returning by the on-going war with the Spanish.  When he finally returned, in 1590, he found no trace of the colony–all inhabitants, including his grand-daughter, Virginia Dare, the first child born in Roanoke Colony and thus the first English child born in the New World, were gone, leaving behind a single word, “Croatoan,” carved on a tree.  It is believed that they attempted to migrate to Croatoan Island (near Cape Hatteras), and were absorbed into the Croatan tribe there.  In any case, the story of the Lost Colony was born…  though in fact, it was the Colony Lost Again.

…one of the chiefe trees or postes at the right side of the entrance had the barke taken off, and 5. foote from the ground in fayre Capitall letters was grauen CROATOAN without any crosse or signe of distresse

-Richard Hakluyt, from his description of the deserted settlement at Roanoke Island, August 18, 1590;  Principal Navigations, Voyages of the English Nation, Vol. III, 1600

White at the tree

 

 

“It is unwise to be too sure of one’s own wisdom. It is healthy to be reminded that the strongest might weaken and the wisest might err.”*…

 


An 1874 engraving showing a probably apocryphal account of Newton’s lab fire. In the story, Newton’s dog started the fire, burning 20 years of research. Newton is thought to have said: “O Diamond, Diamond, thou little knowest the mischief thou hast done.”

 

Imagine a black box which, when you pressed a button, would generate a scientific hypothesis. 50% of its hypotheses are false; 50% are true hypotheses as game-changing and elegant as relativity. Even despite the error rate, it’s easy to see this box would quickly surpass space capsules, da Vinci paintings, and printer ink cartridges to become the most valuable object in the world. Scientific progress on demand, and all you have to do is test some stuff to see if it’s true? I don’t want to devalue experimentalists. They do great work. But it’s appropriate that Einstein is more famous than Eddington. If you took away Eddington, someone else would have tested relativity; the bottleneck is in Einsteins. Einstein-in-a-box at the cost of requiring two Eddingtons per insight is a heck of a deal.

What if the box had only a 10% success rate? A 1% success rate? My guess is: still most valuable object in the world. Even an 0.1% success rate seems pretty good, considering (what if we ask the box for cancer cures, then test them all on lab rats and volunteers?) You have to go pretty low before the box stops being great.

I thought about this after reading this list of geniuses with terrible ideas. Linus Pauling thought Vitamin C cured everything. Isaac Newton spent half his time working on weird Bible codes. Nikola Tesla pursued mad energy beams that couldn’t work. Lynn Margulis revolutionized cell biology by discovering mitochondrial endosymbiosis, but was also a 9-11 truther and doubted HIV caused AIDS. Et cetera. Obviously this should happen. Genius often involves coming up with an outrageous idea contrary to conventional wisdom and pursuing it obsessively despite naysayers. But nobody can have a 100% success rate. People who do this successfully sometimes should also fail at it sometimes, just because they’re the kind of person who attempts it at all. Not everyone fails. Einstein seems to have batted a perfect 1000 (unless you count his support for socialism). But failure shouldn’t surprise us…
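The excerpt’s back-of-the-envelope logic– expected payoff per press equals success rate times the value of a breakthrough, minus the cost of testing a candidate hypothesis– can be sketched in a few lines of Python.  (All figures here are purely hypothetical illustrations, not Alexander’s.)

```python
# A sketch of the "hypothesis box" thought experiment, with made-up numbers.

def expected_value_per_press(success_rate, value_of_breakthrough, cost_per_test):
    """Average payoff of generating and then testing one hypothesis from the box."""
    return success_rate * value_of_breakthrough - cost_per_test

# Illustrative figures: a breakthrough "worth" 1,000,000 units; each test costs 100.
for rate in (0.5, 0.1, 0.01, 0.001):
    ev = expected_value_per_press(rate, 1_000_000, 100)
    print(f"success rate {rate:7.1%}: expected value per press = {ev:>9,.0f}")
```

Even at a 0.1% success rate the expected value per press stays positive so long as testing is cheap relative to the payoff– which is the excerpt’s point about where the real bottleneck lies.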

Some of the people who have most contributed to our understanding of the world have been inexcusably wrong on basic issues.  But, as Scott Alexander argues, you only need one world-changing revelation to be worth reading: “Rule Thinkers In, Not Out.”

* Mahatma Gandhi

###

As we honor insight where we find it, we might send carefully-addressed birthday greetings to Infante Henrique of Portugal, Duke of Viseu, better known as Prince Henry the Navigator; he was born on this date in 1394.  A central figure in 15th-century Portuguese politics and in the earliest days of the Portuguese Empire, Henry encouraged Portugal’s expeditions (and colonial conquests) in Africa– and thus is regarded as the main initiator (as a product both of Portugal’s expeditions and of those that they encouraged by example) of what became known as the Age of Discoveries.

 source

 

 

“Belief can be manipulated. Only knowledge is dangerous.”*…

 


 

Denis Diderot and the encyclopedists had a seemingly harmless plan to catalog knowledge; but what they intended was far more subversive– to restructure knowledge itself:

Far more influential and prominent than the short single-authored works that Diderot had produced up to this point in his life, the Encyclopédie was expressly designed to pass on the temptation and method of intellectual freedom to a huge audience in Europe and, to a lesser extent, in faraway lands like Saint Petersburg and Philadelphia. Ultimately carried to term through ruse, obfuscation, and sometimes cooperation with the authorities, the Encyclopédie (and its various translations, republications, and pirated excerpts and editions) is now considered the supreme achievement of the French Enlightenment: a triumph of secularism, freedom of thought, and eighteenth-century commerce…

At first glance, [Diderot’s] large map of topics, which ranged from comets to epic poetry, seems quite inoffensive. Indeed, the Encyclopédie’s earliest critic, the Jesuit priest Guillaume-François Berthier, did not quibble with how Diderot had organized the “System”; he simply accused Diderot of stealing this aspect of Bacon’s work without proper acknowledgment. Diderot’s real transgression, however, was not following the English philosopher more closely. For, while it was true that Diderot freely borrowed the overall structure of his tree of knowledge from Bacon, he had actually made two significant changes to the Englishman’s conception of human understanding. First, he had broken down and subverted the traditional hierarchical relationship between liberal arts (painting, architecture, and sculpture) and “mechanical arts” or trades (i.e., manual labor). Second, and more subversively, he had shifted the category of religion squarely under humankind’s ability to reason. Whereas Bacon had carefully and sagely preserved a second and separate level of knowledge for theology outside the purview of the three human faculties, Diderot made religion subservient to philosophy, essentially giving his readers the authority to critique the divine…

The only other subject more problematic than religion was politics. In a country without political parties, where sedition was punished by sentencing to a galley ship or death, d’Alembert and Diderot never overtly questioned the spiritual and political authority of the monarchy. Yet the Encyclopédie nonetheless succeeded in advancing liberal principles, including freedom of thought and a more rational exercise of political power. As tepid as some of these writings may seem when compared with the political discourse of the Revolutionary era, the Encyclopédie played a significant role in destabilizing the key assumptions of Absolutism.

Diderot’s most direct and dangerous entry in this vein was his unsigned article on “Political Authority” (“Autorité politique”), which also appeared in the first volume of the Encyclopédie. Readers who chanced upon this article immediately noticed that it does not begin with a definition of political authority itself; instead, it opens powerfully with an unblemished assertion that neither God nor nature has given any one person the indisputable authority to reign…

From a fascinating excerpt of Andrew S. Curran’s  Diderot and the Art of Thinking Freely.  Read the piece in full at “How Diderot’s Encyclopedia Challenged the King.”

* Frank Herbert

###

As we note that knowledge is power, we might recall that it was on this date in 1920 that the League of Women Voters was founded.  Created to support women’s suffrage, it remains nonpartisan, neither supporting nor opposing candidates or parties, and advocating for (now more broadly understood) voting rights and for campaign finance reform.  The League sponsored the Presidential debates in 1976, 1980, and 1984, but withdrew in 1988, when the demands of the two parties became untenable. Then-LWV President Nancy Neuman said that the debate format on which the parties were insisting would “perpetrate a fraud on the American voter” and that her organization did not intend to “become an accessory to the hoodwinking of the American public.”

source

 

Written by LW

February 14, 2019 at 1:01 am

“In the modern world the stupid are cocksure while the intelligent are full of doubt”*…

 



 

Dismayed at the Nazification of Germany, the philosopher [Bertrand Russell] wrote “The Triumph of Stupidity,” attributing the rise of Adolf Hitler to the organized fervor of stupid and brutal people—two qualities, he noted, that “usually go together.” He went on to make one of his most famous observations, that the “fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.”

Russell’s quip prefigured the scientific discovery of a cognitive bias—the Dunning–Kruger effect—that has been so resonant that it has penetrated popular culture, inspiring, for example, an opera song (from Harvard’s annual Ig Nobel Award Ceremony): “Some people’s own incompetence somehow gives them a stupid sense that anything they do is first rate. They think it’s great.” No surprise, then, that psychologist Joyce Ehrlinger prefaced a 2008 paper she wrote with David Dunning and Justin Kruger, among others, with Russell’s comment—the one he later made in his 1951 book, New Hopes for a Changing World: “One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.” “By now,” Ehrlinger noted in that paper, “this phenomenon has been demonstrated even for everyday tasks, about which individuals have likely received substantial feedback regarding their level of knowledge and skill.” Humans have shown a tendency, in other words, to be a bit thick about even the most mundane things, like how well they drive…

But what exactly is stupidity? David Krakauer, the President of the Santa Fe Institute, told interviewer Steve Paulson, for Nautilus, that stupidity is not simply the opposite of intelligence. “Stupidity is using a rule where adding more data doesn’t improve your chances of getting [a problem] right,” Krakauer said. “In fact, it makes it more likely you’ll get it wrong.” Intelligence, on the other hand, is using a rule that allows you to solve complex problems with simple, elegant solutions. “Stupidity is a very interesting class of phenomena in human history, and it has to do with rule systems that have made it harder for us to arrive at the truth,” he said. “It’s an interesting fact that, whilst there are numerous individuals who study intelligence—there are whole departments that are interested in it—if you were to ask yourself what’s the greatest problem facing the world today, I would say it would be stupidity. So we should have professors of stupidity—it would just be embarrassing to be called the stupid professor.”

Stupidity, and what to do about it: “The Case for Professors of Stupidity.”

* Bertrand Russell, “The Triumph of Stupidity” (1933)

###

As we get smart, we might spare a thought for Pyotr Alexeyevich Kropotkin; he died on this date in 1921.  A scientist and geographer, he combined biological and historical fact to arrive at his theory of Mutual Aid.  While an army officer in Siberia, he studied the native animals, made geographical surveys, and examined the effects of the Ice Age in Asia and Europe.  His investigation of the structural lines of mountain ranges revised the cartography of eastern Asia.

But Kropotkin is probably better remembered as a revolutionary.  He wrote a series of articles against social Darwinism and its tenet of the benefits of competition.  Kropotkin argued that sociability characterized animals; thus, he held, cooperation rather than struggle guided the evolution of man and human intelligence.  These beliefs led him to propose a decentralized, “communist” society– a form of anarcho-communism, his championing of which led to his 41-year exile from Russia.  He returned in 1917, but was disappointed in the Bolshevik form of state socialism (the centralization– and totalitarian quality– of which violently conflicted with his belief in decentralization, freedom, and voluntary cooperation).

source

 

“For most of human history, ‘literature’… has been narrated, not written — heard, not read”*…

 


Bradshaw rock paintings help Aboriginal people commit knowledge to memory

 

In preliterate societies, oral stories were likewise relied upon as necessary and meaningful—and they conveyed a range of knowledge and human experiences. In some instances, particularly in harsh environments like Australia where certain information was key to survival, rigid methods of intergenerational knowledge transfer were in place. Essential knowledge, such as that for finding water and shelter, or for knowing what food was present where, was passed down along patriarchal lines but routinely cross-checked for accuracy and completeness between those lines.

But knowledge was also exchanged from generation to generation through song, dance, and performance. Geography and history in Aboriginal Australian societies were told as people moved along songlines, which were remembered routes across the land. Their memories were prompted by particular landforms. Even ancient rock art may have been created as memory aids, prompts to help storytellers recall particular pieces of information. Today many Aboriginal groups keep alive their ancient memories of songlines.

Such oral traditions could be viewed as “books” that were kept in the mental libraries of those who had actually heard and memorized them. Knowledge was passed on by “reading” those books out loud to young people, some of whom memorized them and would later “read” them to others. And so these ancient stories are still alive today—from memorable events like the formation of Crater Lake or the drowning of land along the Australian fringe to information about the names of places and their associations.

Now pause to consider what this means.

Humanity has direct memories of events that occurred 10 millennia ago…

Evidence gathered in recent years shows that some ancient narratives contain remarkably reliable records of real events– records that have survived for thousands of years: “The Oldest True Stories in the World.”

See also: “How oral cultures memorise so much information” (the source of the image above).

* Angela Carter

###

As we sing along, we might recall that it was on this date in 1869 that The Aboriginal Protection Act was enacted in Britain’s colony of Victoria– Australia, as now we know it.  At a time when democratic reforms were being introduced for most of the population (including the extension of the franchise from the wealthy to all adult males and the provision of free public education), Aboriginal people were losing their freedom: pursuant to the Act, the government developed controls over where Aboriginal people could live and work, what they could do, and whom they could meet or marry– and, most horrifyingly, it removed Aboriginal children from their families, starting the process that created the Stolen Generation.  Still, the Aboriginal oral tradition has survived.

apa source

 

“The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge”*…

 


After the fall of the Berlin Wall, East German citizens were offered the chance to read the files kept on them by the Stasi, the much-feared Communist-era secret police service. To date, it is estimated that only 10 percent have taken the opportunity.

In 2007, James Watson, the co-discoverer of the structure of DNA, asked that he not be given any information about his APOE gene, one allele of which is a known risk factor for Alzheimer’s disease.

Most people tell pollsters that, given the choice, they would prefer not to know the date of their own death—or even the future dates of happy events.

Each of these is an example of willful ignorance. Socrates may have made the case that the unexamined life is not worth living, and Hobbes may have argued that curiosity is mankind’s primary passion, but many of our oldest stories actually describe the dangers of knowing too much. From Adam and Eve and the tree of knowledge to Prometheus stealing the secret of fire, they teach us that real-life decisions need to strike a delicate balance between choosing to know, and choosing not to.

But what if a technology came along that shifted this balance unpredictably, complicating how we make decisions about when to remain ignorant? That technology is here: It’s called artificial intelligence.

AI can find patterns and make inferences using relatively little data. Only a handful of Facebook likes are necessary to predict your personality, race, and gender, for example. Another computer algorithm claims it can distinguish between homosexual and heterosexual men with 81 percent accuracy, and homosexual and heterosexual women with 71 percent accuracy, based on their picture alone. An algorithm named COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) can predict criminal recidivism from data like juvenile arrests, criminal records in the family, education, social isolation, and leisure activities with 65 percent accuracy…

Knowledge can sometimes corrupt judgment, and we often choose to remain deliberately ignorant in response.  But in an age of all-knowing algorithms, how do we choose not to know?  Two scientists at the Max Planck Institute for Human Development argue that “We Need to Save Ignorance From AI.”

* Daniel J. Boorstin

###

As we consider closing our eyes, we might send discoverable birthday greetings to Tim Bray; he was born on this date in 1955.  A seminal software developer and entrepreneur, he is probably best known as the co-author of the original XML and XML namespace specifications, open standards that fueled the growth of the internet (by setting down simple rules for encoding documents in a format that is both human-readable and machine-readable), and as the co-founder of the Open Text Corporation, which released the Open Text Index, one of the first popular commercial web search engines.

source

 

Written by LW

June 21, 2018 at 1:01 am

“Knowledge, like air, is vital to life. Like air, no one should be denied it.”*…

 

Belgian information activist Paul Otlet (1927)

More than a century ago, Belgian information activist Paul Otlet envisioned a universal compilation of knowledge and the technology to make it globally available. He foresaw, in other words, some of the possibilities of today’s Web.

Otlet’s ideas provide an important pivot point in the history of recording knowledge and making it accessible. In classical times, the best-known example of the knowledge enterprise was the Library of Alexandria. This great repository of knowledge was built in the Egyptian city of Alexandria around 300 BCE by Ptolemy I and was destroyed between 48 BCE and 642 CE, supposedly by one or more fires. The size of its holdings is also open to question, but the biggest number that historians cite is 700,000 papyrus scrolls, equivalent to perhaps 100,000 modern books…

Any hope of compacting all we know today into 100,000 books—or 28 encyclopedic volumes—is long gone. The Library of Congress holds 36 million books and printed materials, and many university libraries also hold millions of books. In 2010, the Google Books Library Project examined the world’s leading library catalogs and databases. The project, which scans hard copy books into digital form, estimated that there are 130 million existing individual titles. By 2013, Google had digitized 20 million of them.

This massive conversion of books to bytes is only a small part of the explosion in digital information. Writing in the Financial Times, Stephen Pritchard notes that humanity generated almost 2 trillion gigabytes of varied data in 2011, an amount projected to double every two years, forming a growing trove of Big Data available on about 1 billion websites… Search engines let us trek some distance into this world, but other approaches can allow us to explore it more efficiently or deeply. A few have sprung up. Wikipedia, for instance, classifies Web content under subject headings…

But there is a bigger question: Can we design an overall approach that would reduce the “static” and allow anyone in the world to rapidly pinpoint and access any desired information? That’s the question Paul Otlet raised and answered—in concept if not in execution. Had he fully succeeded, we might today have a more easily navigable Web.

Otlet, born in Brussels, Belgium, in 1868, was an information science pioneer. In 1895, with lawyer and internationalist Henri La Fontaine, he established the International Institute of Bibliography, which would develop and distribute a universal catalog and classification system. As Boyd Rayward writes in the Journal of Library History, this was “no more and no less than an attempt to obtain bibliographic control over the entire spectrum of recorded knowledge.”…

The remarkable story in full at: “The internet before the internet: Paul Otlet’s Mundaneum.”

* Alan Moore, V for Vendetta

###

As we try to comprehend comprehensiveness, we might recall that it was on this date in 1985 that the first .com Internet domain, symbolics.com, was registered by Symbolics, a now-defunct Massachusetts computer company.

 source

 

Written by LW

March 15, 2018 at 1:01 am
