Posts Tagged ‘Tim Berners-Lee’
“Of course there’s a lot of knowledge in universities: the freshmen bring a little in; the seniors don’t take much away, so knowledge sort of accumulates”*…

Professor Paul Musgrave on the wacky world of university fundraising…
I would like you to buy me a chair. Not just any chair: an endowed chair.
Let me explain.
Universities have strange business models. The legendary University of California president Clark Kerr once quipped that their functions were “To provide sex for the students, sports for the alumni, and parking for the faculty.” These days, the first is laundered for public consumption as “the student experience” and the third is a cost center (yes, many to most professors have to pay, rather a lot, for their parking tags). (The second remains unchanged.)
You can tell that Kerr was president during a time of lavish support because he didn’t include the other function of a university: to provide naming opportunities for donors.
…
Presidents, chancellors, and provosts seek to finagle gifts because the core business of universities—providing credits to students in exchange for tuition—is both volatile and insufficient to meet the boundless ambitions of administrators and faculty alike. (Faculty might protest that their ambitions are quite modest, as they include merely limitless research budgets and infinite releases from course time—but other than that, they ask only for cost of living adjustments as well as regular salary increases.) Trustees expect presidents to bring in new buildings and new chairs; presidents expect trustees to help dun their friends and acquaintances for donations. The incentives even trickle down to deans, directors, and chairs, all of whom live with increasingly austere baseline budgets and a concomitant incentive to find and cultivate donors to expand, or even just support, their operations.
It’s easy, and wrong, for faculty to be cynical about this. First, these operations reflect the gloriously incongruous medieval nature of the university. Higher education in its upper reaches resembles medieval monasteries, and such monasteries provided not just seclusion and sanctity for their initiates but the possibility of the purchase of virtue for the wealthy. So, too, do universities offer grateful alumni and those sentimental about the generation of knowledge opportunities to turn worldly wealth into tax-deductible noblesse oblige.
Second, donors are the customers for the other product of the university: the social proof of good works. Universities offer donors solicitous for the future of the less fortunate opportunities to subsidize tuition, and they offer donors more interested in the benefits of knowledge the opportunity to subsidize research. The reward comes in some combination of the knowledge that such works are being done and the fact that the donor’s name will be associated with it. (Few large university buildings are named the Anonymous Center for Cancer Research.)
…
The bar for giving continues to rise. Nine-figure gifts were once unheard of; nowadays, they are striking but no longer unprecedented. For such a sum you can have a constituent college named for yourself. The next frontier must be the billion, or multi-billion, dollar gift. For that level, of course, the reward would have to be commensurate. Given that Harvard was named for a donor who left some books and a few hundred pounds to his eponymous university, one wonders whether someone in Harvard’s charitable receiving arm hasn’t calculated how much it would cost to become, say, the Zuckerberg-Harvard University. (I would wager that an earnest offer of $10 billion would at least raise the issue.)…
[There follows a price list for endowed/named Chairs at different universities, and an analysis of their economics. The author suggests that a chair for him would run $2.5-3 million…]
Fascinating: “Buy Me a Chair,” from @profmusgrave.
* A. Lawrence Lowell (legal scholar and President of Harvard University from 1909 to 1933)
###
As we dig deep, we might recall that it was on this date in 1991 that the World Wide Web was introduced to the world at large.
In 1989, Tim Berners-Lee (now Sir Tim) proposed the system to his colleagues at CERN. He got a working system implemented by the end of 1990, including a browser called WorldWideWeb (which became the name of the project and of the network) and an HTTP server running at CERN. As part of that development, he defined the first version of the HTTP protocol, the basic URL syntax, and implicitly made HTML the primary document format.
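For the technically curious, a minimal sketch of just how spare those building blocks were, using Python's standard library (the URL is the address of CERN's first web page, used here purely as an illustration): the scheme/host/path anatomy of a URL, and the single-line form of an early HTTP/0.9-style request.

```python
from urllib.parse import urlparse

# The basic URL anatomy Berners-Lee defined: scheme://host/path
# (info.cern.ch served the first web page; used here purely as an example.)
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)
print(parts.scheme)  # -> http
print(parts.netloc)  # -> info.cern.ch
print(parts.path)    # -> /hypertext/WWW/TheProject.html

# An HTTP/0.9-style request was a single line -- no headers, no version string --
# and the response was simply the HTML document, with no status line.
request = f"GET {parts.path}\r\n"
print(request)
```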
The technology was released outside CERN to other research institutions starting in January 1991, and then– with the publication of this (likely the first public) web page– to the whole Internet 32 years ago today. Within the next two years, some 50 websites had been created. (Today, while the number of active sites fluctuates, the total is estimated at over 1.5 billion.)

“A nothing will serve just as well as a something about which nothing could be said”*…
Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…
If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.
The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – an experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.
As a scientist and mathematician, Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.
I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.
To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…
A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.
* Ludwig Wittgenstein, Philosophical Investigations
###
As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located in Switzerland, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.
Because researchers need remote access to these facilities, the lab has historically been a major wide area network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser”– and effectively fomented the emergence of the web.
“Outward show is a wonderful perverter of the reason”*…
Humans have long hungered for a short-hand to help in understanding and managing other humans. From phrenology to the Myers-Briggs Test, we’ve tried dozens of short-cuts… and tended to find that at best they weren’t very helpful; at worst, they reinforced inaccurate stereotypes and so led to results that were unfair and ineffective. Still, the quest continues– these days powered by artificial intelligence. What could go wrong?…
Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.
While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.
But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.
Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.
In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.
But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…
“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”
* Marcus Aurelius, Meditations
###
As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.
It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web, though his proposal was a relatively vague request to research the details and feasibility of such a system. He later submitted a more detailed proposal, on November 12, 1990, that laid out the actual implementation of the World Wide Web.
“Printing…is the preservative of all arts”*…

Frontispiece of the Dunhuang Diamond Sūtra
In 366, the itinerant monk Yuezun was wandering through the arid landscape [around the Western Chinese city of Dunhuang] when a fantastical sight appeared before him: a thousand buddhas, bathed in golden light. (Whether heat, exhaustion or the strange voice of the sands worked themselves on his imagination is anyone’s guess.) Awed by his vision, Yuezun took up hammer and chisel and carved a devotional space into a nearby cliff-face. It soon became a centre for religion and art: Dunhuang was situated at the confluence of two major Silk Road routes, and both departing and returning merchants made offerings. By the time the site fell into disuse in the 14th century, almost 500 temples had been carved from the cliff.
Among the hundreds of caves was a chamber that served as a storeroom for books. The Library Cave held more than 50,000 texts: religious tracts, business reports, calendars, dictionaries, government documents, shopping lists, and the oldest dated printed book in the world. A colophon at the end of the Dunhuang Diamond Sūtra scroll dates it to 868, nearly six centuries before the first Gutenberg Bible…
Learn more at: “The Oldest Printed Book in the World.” Then page through the British Library’s digitization of its restoration.
* Isaiah Thomas (the 19th century publisher and author, not the basketball player)
###
As we treasure tomes, we might recall that it was on this date in 1990 that Tim Berners-Lee published a formal proposal for a “Hypertext project” that he called the World Wide Web (though at the time he rendered it in one word: “WorldWideWeb”)… laying the foundation for a network that has become central to the information age– a network that, with its connected technologies, is believed by many to have sparked a revolution as fundamental and impactful as the revolution ignited by Gutenberg and moveable type.
“If you like overheads, you’ll love PowerPoint”*…
Military Industrial Powerpoint Complex is a collection created as a special project for the Internet Archive’s 20th Anniversary celebration in 2016, highlighting IA’s web archive. It consists of all of the PowerPoint files (57,489) from the .mil web domain.
Plumb the depths at The Military Industrial Powerpoint Complex.
* Edward Tufte
###
As we hold our heads in our hands, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet. It was the first time Berners-Lee proposed the system that would ultimately become the World Wide Web, so this date is oft cited as the “Birthday of the Web.” But his pitch was a bit vague, and got no traction. He followed up with a second, more detailed proposal on November 12, 1990– on which CERN acted… so many consider this later date the Web’s inception.