(Roughly) Daily

Posts Tagged ‘world wide web’

“Cyberspace undeniably reflects some form of geography”*…

Your correspondent is stepping again into the rapids, so (Roughly) Daily is going into a short hiatus. Regular service should resume on or around Nov 4. Here, something to enjoy in the meantime…

Our old friend Neal Agarwal has created an interactive museum of sorts, a stroll through the history of the internet, as manifest in the artifacts of important “firsts”– the first smiley, the first MP3, the first “LOL,” the first live-streamed concert, and so, so much more…

Browse through Internet Artifacts, from @nealagarwal.

* Sandra Day O’Connor

###

As we touch the exhibits, we might send imperial birthday greetings to William Henry Gates III; he was born on this date in 1955. Gates is, of course, best known for co-founding the technology giant Microsoft, along with his childhood friend Paul Allen. He led the company from its packaged software beginnings onto the internet. After leaving the company in 2008, he founded several other companies, including BEN, Cascade Investment, TerraPower, bgC3, and Breakthrough Energy; but he has increasingly turned his attention to philanthropy.

Bill Gates

source

Written by (Roughly) Daily

October 28, 2023 at 1:00 am

“Of course there’s a lot of knowledge in universities: the freshmen bring a little in; the seniors don’t take much away, so knowledge sort of accumulates”*…

Professor Paul Musgrave on the wacky world of university fundraising…

I would like you to buy me a chair. Not just any chair: an endowed chair.

Let me explain.

Universities have strange business models. The legendary University of California president Clark Kerr once quipped that their functions were “To provide sex for the students, sports for the alumni, and parking for the faculty.” These days, the first is laundered for public consumption as “the student experience” and the third is a cost center (yes, many to most professors have to pay, rather a lot, for their parking tags). (The second remains unchanged.)

You can tell that Kerr was president during a time of lavish support because he didn’t include the other function of a university: to provide naming opportunities for donors.

Presidents, chancellors, and provosts seek to finagle gifts because the core business of universities—providing credits to students in exchange for tuition—is both volatile and insufficient to meet the boundless ambitions of administrators and faculty alike. (Faculty might protest that their ambitions are quite modest, as they include merely limitless research budgets and infinite releases from course time—but other than that, they ask only for cost of living adjustments as well as regular salary increases.) Trustees expect presidents to bring in new buildings and new chairs; presidents expect trustees to help dun their friends and acquaintances for donations. The incentives even trickle down to deans, directors, and chairs, all of whom live with increasingly austere baseline budgets and a concomitant incentive to find and cultivate donors to expand, or even just support, their operations.

It’s easy, and wrong, for faculty to be cynical about this. First, these operations reflect the gloriously incongruous medieval nature of the university. Higher education in its upper reaches resembles medieval monasteries, and such monasteries provided not just seclusion and sanctity for their initiates but the possibility of the purchase of virtue for the wealthy. So, too, do universities offer grateful alumni and those sentimental about the generation of knowledge opportunities to turn worldly wealth into tax-deductible noblesse oblige.

Second, donors are the customers for the other product of the university: the social proof of good works. Universities offer donors solicitous for the future of the less fortunate opportunities to subsidize tuition, and they offer donors more interested in the benefits of knowledge the opportunity to subsidize research. The reward comes in some combination of the knowledge that such works are being done and the fact that the donor’s name will be associated with it. (Few large university buildings are named the Anonymous Center for Cancer Research.)

The bar for giving continues to rise. Nine-figure gifts were once unheard of; nowadays, they are striking but no longer unprecedented. For such a sum you can have a constituent college named for yourself. The next frontier must be the billion, or multi-billion, dollar gift. For that level, of course, the reward would have to be commensurate. Given that Harvard was named for a donor who left some books and a few hundred pounds to his eponymous university, one wonders whether someone in Harvard’s charitable receiving arm hasn’t calculated how much it would cost to become, say, the Zuckerberg-Harvard University. (I would wager that an earnest offer of $10 billion would at least raise the issue.)…

[There follows a price list for endowed/named chairs at different universities, and an analysis of their economics. The author suggests that a chair for him would run $2.5-3 million…]
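For a rough sense of where a number like $2.5-3 million comes from, here is a back-of-the-envelope sketch; the salary figure and the 4.5% endowment payout rate are illustrative assumptions, not figures from Musgrave’s piece.

```python
# Back-of-the-envelope endowed-chair math. All numbers here are assumptions
# for illustration, not figures from the article: universities typically
# spend roughly 4-5% of an endowment each year, so the gift must be large
# enough that the annual payout covers the chairholder indefinitely.
salary_plus_benefits = 120_000   # hypothetical annual cost of the professor
payout_rate = 0.045              # assumed endowment spending rate

required_endowment = salary_plus_benefits / payout_rate
print(f"Endowment needed: ${required_endowment:,.0f}")  # ~$2.7 million
```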

Fascinating: “Buy Me a Chair,” from @profmusgrave.

* A. Lawrence Lowell (legal scholar and President of Harvard University from 1909 to 1933)

###

As we dig deep, we might recall that it was on this date in 1991 that the World Wide Web was introduced to the world at large.

In 1989, Tim Berners-Lee (now Sir Tim) proposed the system to his colleagues at CERN. He got a working system implemented by the end of 1990, including a browser called WorldWideWeb (which became the name of the project and of the network) and an HTTP server running at CERN. As part of that development, he defined the first version of the HTTP protocol, the basic URL syntax, and implicitly made HTML the primary document format.
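That first protocol was strikingly simple. Here is a toy sketch of an HTTP/0.9-style exchange– not Berners-Lee’s code, and the host and port are assumptions– in which the client sends a single GET line and the server replies with raw HTML, then closes the connection.

```python
import socket

# Toy illustration of an HTTP/0.9-style request, the kind of exchange the
# first WorldWideWeb browser and the CERN server spoke: one GET line out,
# raw HTML back, no headers or status codes; closing the connection ends
# the reply. The host and port are assumptions for this sketch.
HOST, PORT = "localhost", 8080

def http09_get(path="/"):
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(f"GET {path}\r\n".encode("ascii"))  # the entire request
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # the server closing the socket ends the response
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    # The path mirrors the address of the first public page at info.cern.ch.
    print(http09_get("/hypertext/WWW/TheProject.html"))
```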

The technology was released outside CERN to other research institutions starting in January 1991, and then– with the publication of this (likely the first public) web page— to the whole Internet 32 years ago today. Within the next two years, there were 50 websites created. (Today, while it is understood that the number of active sites fluctuates, the total is estimated at over 1.5 billion.)

The NeXT Computer used by Tim Berners-Lee at CERN that became the world’s first Web server (source)

“Not with a bang, but a whimper”*…

 


 

What actually happens to workers when a company deploys automation? The common assumption seems to be that the employee simply disappears wholesale, replaced one-for-one with an AI interface or an array of mechanized arms.

Yet given the extensive punditeering, handwringing, and stump-speeching around the “robots are coming for our jobs” phenomenon—which I will never miss an opportunity to point out is falsely represented—research into what happens to the individual worker remains relatively thin. Studies have attempted to monitor the impact of automation on wages in aggregate or to correlate employment to levels of robotization.

But few in-depth investigations have been made into what happens to each worker after their companies roll out automation initiatives. Earlier this year, though, a paper authored by economists James Bessen, Maarten Goos, Anna Salomons, and Wiljan Van den Berge set out to do exactly that…

What emerges is a portrait of workplace automation that is ominous in a less dramatic manner than we’re typically made to understand. For one thing, there is no ‘robot apocalypse’, even after a major corporate automation event. Unlike mass layoffs, automation does not appear to immediately and directly send workers packing en masse.

Instead, automation increases the likelihood that workers will be driven away from their previous jobs at the companies—whether they’re fired, or moved to less rewarding tasks, or quit—and causes a long-term loss of wages for the employee.

The report finds that “firm-level automation increases the probability of workers separating from their employers and decreases days worked, leading to a 5-year cumulative wage income loss of 11 percent of one year’s earnings.” That’s a pretty significant loss.
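(That “11 percent of one year’s earnings” is easy to misread as an 11 percent pay cut; a quick illustration– the salary is an arbitrary assumption, not a number from the paper:)

```python
# The paper's headline figure is a cumulative loss, over five years, equal
# to 11% of ONE year's earnings -- not an 11% cut in pay. The salary below
# is an arbitrary assumption for illustration, not a number from the study.
annual_earnings = 45_000                   # hypothetical pre-automation income
cumulative_loss = 0.11 * annual_earnings

print(f"Total lost over 5 years: {cumulative_loss:,.0f}")      # 4,950
print(f"Average lost per year:   {cumulative_loss / 5:,.0f}")  # 990
```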

Worse still, the study found that even in the Netherlands, which has a social safety net that is generous compared to, say, that of the United States, workers were only able to offset a fraction of those losses with benefits provided by the state. Older workers, meanwhile, were more likely to retire early—deprived of years of income they may have been counting on.

Interestingly, the effects of automation were felt similarly through all manner of company—small, large, industrial, services-oriented, and so on. The study covered all non-finance sector firms, and found that worker separation and income loss were “quite pervasive across worker types, firm sizes and sectors.”

Automation, in other words, is a more pervasive, slower-acting and much less visible phenomenon than the robots-are-eating-our-jobs talk is preparing us for…

The result, Bessen says, is an added strain on the social safety net that it is currently woefully unprepared to handle. As more and more firms join the automation goldrush—a 2018 McKinsey survey of 1,300 companies worldwide found that three-quarters of them had either begun to automate business processes or planned to do so next year—the number of workers forced out of firms seems likely to tick up, or at least hold steady. What is unlikely to happen, per this research, is an automation-driven mass exodus of jobs.

This is a double-edged sword: While it’s obviously good that thousands of workers are unlikely to be fired in one fell swoop when a process is automated at a corporation, it also means the pain of automation is distributed in smaller, more personalized doses, and thus less likely to prompt any sort of urgent public response. If an entire Amazon warehouse were suddenly automated, it might spur policymakers to try to address the issue; if automation has been slowly hurting us for years, it’s harder to rally support for stemming the pain…

Brian Merchant on the ironic challenge of addressing the slow-motion, trickle-down social, economic, and cultural threats of automation– that they will accrue gradually, like erosion, not catastrophically… making it harder to generate a sense of urgency around creating a response: “There’s an Automation Crisis Underway Right Now, It’s Just Mostly Invisible.”

* T. S. Eliot, “The Hollow Men”

###

As we think systemically, we might recall that it was on this date in 1994 that Ken McCarthy, Marc Andreessen, and Mark Graham held the first conference to focus on the commercial potential of the World Wide Web.

 

 

Written by (Roughly) Daily

November 5, 2019 at 1:01 am

“Outward show is a wonderful perverter of the reason”*…

 


Humans have long hungered for a short-hand to help in understanding and managing other humans.  From phrenology to the Myers-Briggs Test, we’ve tried dozens of short-cuts… and tended to find that at best they weren’t actually very helpful; at worst, they reinforced inaccurate stereotypes, and so led to results that were unfair and ineffective.  Still, the quest continues– these days powered by artificial intelligence.  What could go wrong?…

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.

While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.

But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.

Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.

In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.
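To make that concrete, here is a minimal sketch of the kind of classifier such systems rest on: numeric facial “action unit” intensities mapped to a handful of discrete emotion labels. Every feature, label, and data point below is made up for illustration; commercial systems are far more elaborate, but they share the same underlying bet– that a fixed mapping from facial movements to inner feelings exists and can be learned.

```python
# A toy sketch of Ekman-style emotion classification: facial "action unit"
# intensities (e.g., brow raise, lip-corner pull) mapped to discrete emotion
# labels. All features, labels, and data are made up for illustration; the
# point is the assumption baked in -- that faces map cleanly onto feelings.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [inner_brow_raise, brow_lower, lip_corner_pull, jaw_drop]
X_train = np.array([
    [0.1, 0.0, 0.9, 0.2],   # labeled "happiness"
    [0.8, 0.1, 0.0, 0.7],   # labeled "surprise"
    [0.0, 0.9, 0.1, 0.0],   # labeled "anger"
    [0.2, 0.1, 0.8, 0.1],   # labeled "happiness"
    [0.9, 0.0, 0.1, 0.8],   # labeled "surprise"
    [0.1, 0.8, 0.0, 0.1],   # labeled "anger"
])
y_train = ["happiness", "surprise", "anger",
           "happiness", "surprise", "anger"]

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A new face, reduced to four numbers -- the step critics argue strips away
# context, culture, and everything else that gives an expression meaning.
new_face = np.array([[0.3, 0.2, 0.7, 0.1]])
print(model.predict(new_face))        # e.g., ['happiness']
print(model.predict_proba(new_face))  # class probabilities, not feelings
```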

But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…

“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”

* Marcus Aurelius, Meditations

###

As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.

It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web; but his proposal was basically a relatively vague request to research the details and feasibility of such a system.  He later submitted a proposal on November 12, 1990 that much more directly detailed the actual implementation of the World Wide Web.

source

 

Written by (Roughly) Daily

March 12, 2019 at 12:01 am

“Printing…is the preservative of all arts”*…

 


Frontispiece of the Dunhuang Diamond Sūtra

 

In 366, the itinerant monk Yuezun was wandering through the arid landscape [around the Western Chinese city of Dunhuang] when a fantastical sight appeared before him: a thousand buddhas, bathed in golden light. (Whether heat, exhaustion or the strange voice of the sands worked themselves on his imagination is anyone’s guess.) Awed by his vision, Yuezun took up hammer and chisel and carved a devotional space into a nearby cliff-face. It soon became a centre for religion and art: Dunhuang was situated at the confluence of two major Silk Road routes, and both departing and returning merchants made offerings. By the time the site fell into disuse in the 14th century, almost 500 temples had been carved from the cliff.

Among the hundreds of caves was a chamber that served as a storeroom for books. The Library Cave held more than 50,000 texts: religious tracts, business reports, calendars, dictionaries, government documents, shopping lists, and the oldest dated printed book in the world. A colophon at the end of the Dunhuang Diamond Sūtra scroll dates it to 868, nearly six centuries before the first Gutenberg Bible…

Learn more at: “The Oldest Printed Book in the World.”  Then page through the British Library’s digitization of its restoration.

* Isaiah Thomas (the 19th century publisher and author, not the basketball player)

###

As we treasure tomes, we might recall that it was on this date in 1990 that Tim Berners-Lee published a formal proposal for a “Hypertext project” that he called the World Wide Web (though at the time he rendered it in one word: “WorldWideWeb”)… laying the foundation for a network that has become central to the information age– a network that, with its connected technologies, is believed by many to have sparked a revolution as fundamental and impactful as the revolution ignited by Gutenberg and moveable type.

Sir Tim Berners-Lee (source)

 

Written by (Roughly) Daily

November 12, 2018 at 1:01 am