(Roughly) Daily

Posts Tagged ‘Technology’

“History does not repeat itself. The historians repeat one another.”*…

 


Thomas Cole: “The Course of Empire: Destruction” (1836)

 

Your correspondent is headed to the steamy Southeast for his annual communion with surf, sand, and delicacies of the deep-fried variety.  Regular service will resume on or around August 26.  By way of hopping into hiatus on a high note…

The conviction that Trump is single-handedly tipping the United States into a crisis worthy of the Roman Empire at its most decadent has been a staple of jeremiads ever since his election, but fretting whether it is the fate of the United States in the twenty-first century to ape Rome by subsiding into terminal decay did not begin with his presidency. A year before Trump’s election, the distinguished Harvard political scientist Joseph Nye was already glancing nervously over his shoulder at the vanished empire of the Caesars: “Rome rotted from within when people lost confidence in their culture and institutions, elites battled for control, corruption increased and the economy failed to grow adequately.” Doom-laden prophecies such as these, of decline and fall, are the somber counterpoint to the optimism of the American Dream.

And so they have always been.  At various points in American history, various reasons have been advanced to explain why the United States is bound to join the Roman Empire in oblivion…

Tom Holland compares and contrasts (very engagingly) the late history of the Roman Empire with that of the U.S., and (very amusingly) second-century Emperor Commodus with Donald Trump; he concludes:

History serves as only the blindest and most stumbling guide to the future. America is not Rome. Donald Trump is not Commodus. There is nothing written into the DNA of a superpower that says that it must inevitably decline and fall. This is not an argument for complacency; it is an argument against despair. Americans have been worrying about the future of their republic for centuries now. There is every prospect that they will be worrying about it for centuries more.

Enjoy the essay in full: “America Is Not Rome. It Just Thinks It Is.”

* Max Beerbohm

###

As we recognize that this doesn’t actually mean that we can breathe any easier, we might send fantastically far-sighted birthday greetings to Hugo Gernsback, a Luxembourgian-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio.  But it was as a writer and publisher that he probably left his most lasting mark:  In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, juke boxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Relatedly: Philo T. Farnsworth, the man who actually did invent television, was born just a few days later on the calendar– August 19, 1906– and later continued his television work at Philco… a company named not for him, but for its origins as the Philadelphia Storage Battery Company.)

Gernsback, wearing one of his inventions, TV Glasses

source

 

Written by LW

August 16, 2019 at 1:01 am

“Central planning didn’t work for Stalin or Mao, and it won’t work for an entrepreneur either”*…

 

Amazon central planning

 

Applying science to social problems has brought huge dividends in the past. Long before the invention of the silicon chip, medical and technological innovations had already made our lives far more comfortable – and longer. But history is also replete with disasters caused by the power of science and the zeal to improve the human condition.

For example, efforts to boost agricultural yields through scientific or technological augmentation in the context of collectivization in the Soviet Union or Tanzania backfired spectacularly. Sometimes, plans to remake cities through modern urban planning all but destroyed them. The political scientist James Scott has dubbed such efforts to transform others’ lives through science instances of “high modernism.”

An ideology as dangerous as it is dogmatically overconfident, high modernism refuses to recognize that many human practices and behaviors have an inherent logic that is adapted to the complex environment in which they have evolved. When high modernists dismiss such practices in order to institute a more scientific and rational approach, they almost always fail.

Historically, high-modernist schemes have been most damaging in the hands of an authoritarian state seeking to transform a prostrate, weak society. In the case of Soviet collectivization, state authoritarianism originated from the self-proclaimed “leading role” of the Communist Party, and pursued its schemes in the absence of any organizations that could effectively resist them or provide protection to peasants crushed by them.

Yet authoritarianism is not solely the preserve of states. It can also originate from any claim to unbridled superior knowledge or ability. Consider contemporary efforts by corporations, entrepreneurs, and others who want to improve our world through digital technologies. Recent innovations have vastly increased productivity in manufacturing, improved communication, and enriched the lives of billions of people. But they could easily devolve into a high-modernist fiasco…

But this characteristically high-modernist path is not preordained. Instead of ignoring social context, those developing new technologies could actually learn something from the experiences and concerns of real people. The technologies themselves could be adaptive rather than hubristic, designed to empower society rather than silence it.

Two forces are likely to push new technologies in this direction. The first is the market, which may act as a barrier against misguided top-down schemes. Once Soviet planners decided to collectivize agriculture, Ukrainian villagers could do little to stop them. Mass starvation ensued. Not so with today’s digital technologies, the success of which will depend on decisions made by billions of consumers and millions of businesses around the world (with the possible exception of those in China)…

That said, the power of the market constraint should not be exaggerated. There is no guarantee that the market will select the right technologies for widespread adoption, nor will it internalize the negative effects of some new applications. The fact that Facebook exists and collects information about its 2.5 billion active users in a market environment does not mean we can trust how it will use that data. The market certainly doesn’t guarantee that there won’t be unforeseen consequences from Facebook’s business model and underlying technologies.

For the market constraint to work, it must be bolstered by a second, more powerful check: democratic politics. Every state has a proper role to play in regulating economic activity and the use and spread of new technologies. Democratic politics often drives the demand for such regulation. It is also the best defense against the capture of state policies by rent-seeking businesses attempting to raise their market shares or profits.

Democracy also provides the best mechanism for airing diverse viewpoints and organizing resistance to costly or dangerous high-modernist schemes. By speaking out, we can slow down or even prevent the most pernicious applications of surveillance, monitoring, and digital manipulation. A democratic voice is precisely what was denied to Ukrainian and Tanzanian villagers confronted with collectivization schemes.

But regular elections are not sufficient to prevent Big Tech from creating a high-modernist nightmare. Insofar as new technologies can thwart free speech and political compromise and deepen concentrations of power in government or the private sector, they can frustrate the workings of democratic politics itself, creating a vicious circle. If the tech world chooses the high-modernist path, it may ultimately damage our only reliable defense against its hubris: democratic oversight of how new technologies are developed and deployed. We as consumers, workers, and citizens should all be more cognizant of the threat, for we are the only ones who can stop it.

At the same time that science and technology have vastly improved human lives, they have also given certain visionaries the means to transform entire societies from above. Ominously, what was true of Soviet central planners is true of Big Tech today: namely, the assumption that society can be improved through pure “rationality.”  Daron Acemoglu– Professor of Economics at MIT, and co-author (with James A. Robinson) of Why Nations Fail: The Origins of Power, Prosperity and Poverty and The Narrow Corridor: States, Societies, and the Fate of Liberty (forthcoming from Penguin Press in September 2019)– explains: “Big Tech’s Harvest of Sorrow?”

[Illustration above from “The Singular Pursuit of Comrade Bezos,” also worth a read.]

* Michael Bloomberg

###

As we rethink “reason,” we might recall that it was on this date in 1894 (one year before Marconi’s first demo) that distinguished physicist Oliver Lodge achieved the first successful radio transmission of information via Morse Code in a presentation to the British Association for the Advancement of Science at a meeting in Oxford.  He sent a message about 50 meters from the old Clarendon Laboratory to the lecture theater of the University Museum.

Lodge continued to develop his approach, and patented several elements of it… running into intellectual property disputes with Marconi.  Finally, in 1912, Lodge, at heart an academic, sold his patents to the more determinedly-commercial Marconi.

source

 

Written by LW

August 14, 2019 at 1:01 am

“A better world won’t come about simply because we use data; data has its dark underside.”*…

 

Data

 

Data isn’t the new oil, it’s the new CO2. It’s a common trope in the data/tech field to say that “data is the new oil”. The basic idea being – it’s a new resource that is being extracted, it is valuable, and is a raw product that fuels other industries. But it also implies that data is inherently valuable in and of itself and that “my data” is valuable, a resource that I really should tap into.

In reality, we are more impacted by the data of other people (with whom we are grouped) than we are by data about us. As I have written in the MIT Technology Review – “even if you deny consent to ‘your’ data being used, an organisation can use data about other people to make statistical extrapolations that affect you.” We are bound by other people’s consent. Our own consent (or lack thereof) is becoming increasingly irrelevant. We won’t solve the societal problems pervasive data surveillance is causing by rushing through online consent forms. If you see data as CO2, it becomes clearer that its impacts are societal, not solely individual. My neighbour’s car emissions, the emissions from a factory on a different continent, impact me more than my own emissions or lack thereof. This isn’t to abdicate individual responsibility or harm. It’s adding a new lens that we too often miss entirely.

We should not endlessly be defending arguments along the lines that “people choose to willingly give up their freedom in exchange for free stuff online”. The argument is flawed for two reasons. First, the reason that is usually given: people have no choice but to consent in order to access the service, so consent is manufactured.  We are not exercising choice in providing data, but rather are resigned to the fact that we have no choice in the matter.

The second, less well known but just as powerful, argument is that we are not only bound by other people’s data; we are bound by other people’s consent.  In an era of machine learning-driven group profiling, this effectively renders my denial of consent meaningless. Even if I withhold consent– say I refuse to use Facebook or Twitter or Amazon– the fact that everyone around me has joined means there are just as many data points about me to target and surveil. The issue is systemic; it is not one where a lone individual can make a choice and opt out of the system. We perpetuate this myth by talking about data as our own individual “oil,” ready to sell to the highest bidder. In reality I have little control over this supposed resource, which acts more like an atmospheric pollutant, impacting me and others in myriad indirect ways. There are more relations– direct and indirect– between data related to me, data about me, and data inferred about me via others than I can possibly imagine, let alone control with the tools we have at our disposal today.

Because of this, we need a social, systemic approach to deal with our data emissions. An environmental approach to data rights as I’ve argued previously. But first let’s all admit that the line of inquiry defending pervasive surveillance in the name of “individual freedom” and individual consent gets us nowhere closer to understanding the threats we are facing.
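To make the group-profiling mechanism concrete, here is a minimal sketch in Python (using numpy and scikit-learn; the data, the features, and their correlations are all invented for illustration) of how a model trained only on consenting users can still generate a confident inference about someone who opted out:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: each row is a user who consented to share data.
# The three columns stand in for observable proxies (say, age bracket,
# postcode cluster, and hours online per day) -- all hypothetical.
rng = np.random.default_rng(0)
n = 1_000
X_consenting = rng.normal(size=(n, 3))

# A sensitive attribute that happens to correlate with those proxies.
y_consenting = (X_consenting @ np.array([1.5, -2.0, 1.0])
                + rng.normal(size=n)) > 0

# Train only on the people who said yes.
model = LogisticRegression().fit(X_consenting, y_consenting)

# Someone who never consented -- but whose proxies are still observable.
x_opted_out = np.array([[0.8, -1.1, 0.5]])
p = model.predict_proba(x_opted_out)[0, 1]
print(f"inferred probability for the non-consenting user: {p:.2f}")
```

The opt-out changes nothing: the model never needed that person’s data, only data from people statistically like them– which is precisely the sense in which we are “bound by other people’s consent.”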

Martin Tisné argues for an “environmental” approach to data rights: “Data isn’t the new oil, it’s the new CO2.”

Lest one think that we couldn’t/shouldn’t have seen this (and related issues like over-dependence on algorithms, the digital divide, et al.) coming, see also Paul Baran‘s prescient 1968 essay, “On the Future Computer Era,” one of the last pieces he did at RAND, before co-leading the spin-off of The Institute for the Future.

* Mike Loukides, Ethics and Data Science

###

As we ponder privacy, we might recall that it was on this date in 1981 that IBM released IBM model number 5150– AKA the IBM PC– the original version and progenitor of the IBM PC compatible hardware platform. Because the machine was based on an open architecture, third-party suppliers of peripheral devices, expansion cards, and software proliferated within a short time of its introduction.  The IBM PC’s influence on the personal computer market was substantial: it standardized a platform for personal computers (and created a market for Microsoft’s operating system– first PC DOS, then Windows– on which that platform ran).  “IBM compatible” became an important criterion for sales growth; after the 1980s, only the Apple Macintosh family kept a significant share of the microcomputer market without compatibility with the IBM personal computer.

source

 

Written by LW

August 12, 2019 at 1:01 am

“Home’s where you go when you run out of homes”*…

 

home

 

When we imagine the homes of the future, we can’t just think about the technologies that could alter our domestic lives. We also need to think about the changing ways that people relate to their habitats.

For the past five years, Ikea has been on a mission to better understand people’s relationships with their homes by doing in-depth sociological studies of its consumers. The company publishes its findings in its annual Life at Home report, which began in 2014. Last year’s report involved visiting the houses and apartments of 22,000 people across 22 countries to better understand what everyday living looks like in today’s world.

What Ikea found was that our fundamental notions of home and family are experiencing a transformation. Plenty of demographic research suggests that major changes in where and how we live could be afoot: For instance, people who marry later may spend more years living with roommates. If couples delay having children—or choose to remain child-free—they may choose to live longer in smaller apartments. As people live longer, we might find more multigenerational homes, as parents, children, and grandchildren all cohabit under one roof.

In addition to those demographic shifts, Ikea’s research uncovered something else: Many of the people in its large study were not particularly satisfied with their domestic life. For one thing, they’re increasingly struggling to feel a sense of home in the places they live; 29% of people surveyed around the world felt more at home in other places than the space where they live every day. A full 35% of people in cities felt this way.

Ikea surveyed 22,000 people in 22 countries, and came up with six visions for the future of our homes: “See Ikea’s 6 visions for how we’ll live in the future.”

* John le Carré, The Honourable Schoolboy

###

As we settle in, we might send pointed birthday greetings to Nicolas-Jacques Conté; he was born on this date in 1755.  A painter, balloonist, and army officer, he is best remembered as the inventor of the modern pencil.  At a time when the French Republic was under economic blockade and unable to import graphite from Great Britain, its main source of the material, Conté was asked by Lazare Nicolas Marguerite Carnot to create an alternative.  Conté mixed powdered graphite with clay and pressed the material between two half-cylinders of wood– forming the first modern pencil.  He received a patent for the invention in 1795, and formed la Société Conté to make them.  He also invented the conté crayon (named after him), a hard pastel stick used by artists.

source

 

 

Written by LW

August 4, 2019 at 1:01 am

“A classical computation is like a solo voice—one line of pure tones succeeding each other. A quantum computation is like a symphony—many lines of tones interfering with one another.”*…

 


 

Quantum computers will never fully replace “classical” ones like the device you’re reading this article on. They won’t run web browsers, help with your taxes, or stream the latest video from Netflix.

What they will do—what’s long been hoped for, at least—will be to offer a fundamentally different way of performing certain calculations. They’ll be able to solve problems that would take a fast classical computer billions of years to perform. They’ll enable the simulation of complex quantum systems such as biological molecules, or offer a way to factor incredibly large numbers, thereby breaking long-standing forms of encryption.

The threshold where quantum computers cross from being interesting research projects to doing things that no classical computer can do is called “quantum supremacy.” Many people believe that Google’s quantum computing project will achieve it later this year…
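For a taste of what the “interfering lines of tones” in Seth Lloyd’s epigraph mean in practice, here is a minimal single-qubit simulation in Python with numpy– an illustrative sketch, not how real quantum hardware is programmed:

```python
import numpy as np

# Amplitudes for the basis states |0> and |1>; the qubit starts in |0>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0   # amplitudes (0.707, 0.707): both outcomes equally likely
back = H @ superposed   # a second Hadamard: the two paths to |1> cancel out

print(np.abs(superposed) ** 2)  # [0.5 0.5] -- a fair coin
print(np.abs(back) ** 2)        # [1. 0.]  -- interference restores |0> with certainty
```

That cancellation– amplitudes adding and subtracting like sound waves, rather than probabilities simply accumulating– is the resource quantum algorithms exploit; simulating it classically becomes intractable as the number of qubits grows.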

Researchers are getting close to building a quantum computer that can perform tasks a classical computer can’t. Here’s what the milestone will mean: “Quantum Supremacy Is Coming: Here’s What You Should Know.”

* Seth Lloyd, Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos

###

As we get weird, we might recall that it was on this date in 2012 that Ohioan Beth Johnson attempted to break a record that had been set on this same date in 1999 by a group of English college students– for the largest working yoyo in the world.  The British yoyo was 10 feet in diameter; hers, 11 feet, 9 inches.  (It weighed 4,620 lbs.)  Her attempt on this date failed, as did another.  But finally, in September 2012, she was able to deploy it successfully from a crane in Cincinnati… and earn her way into the Guinness Book of World Records.


Beth Johnson and her record-setting creation

source

 

“Of course our lives are regulated. When you come to a stop sign, you stop; if you want to go fishing, you get a license; if you want to shoot ducks, you can shoot only three ducks. The alternative is dead bodies at the intersection, no fish, and no ducks. OK?”*…

 

Regulation

 

After a characteristically-clear explanation of the ways in which the “monopoly practice” concerns around Google, Amazon, and the other on-line giants are different from those the U.S. has traditionally tried to manage– they limit/manage choice– the ever-illuminating Tim O’Reilly argues for a fresh approach to anti-trust:

So how are we best to decide if these Big Tech platforms need to be regulated?

In one famous exchange, Bill Gates, the founder and former CEO of Microsoft, told Chamath Palihapitiya, the one-time head of the Facebook platform:

“This isn’t a platform. A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it. Then it’s a platform.”

Given this understanding of the role of a platform, regulators should be looking to measure whether companies like Amazon or Google are continuing to provide opportunity for their ecosystem of suppliers, or if they’re increasing their own returns at the expense of that ecosystem.

Rather than just asking whether consumers benefit in the short term from the companies’ actions, regulators should be looking at the long-term health of the marketplace of suppliers—they are the real source of that consumer benefit, not the platforms alone. Have Amazon, Apple, or Google earned their profits, or are they coming from monopolistic rents?

How might we know whether a company operating an algorithmically managed marketplace is extracting rents rather than simply taking a reasonable cut for the services it provides? The first sign may not be that it is raising prices for consumers, but that it is taking a larger percentage from its suppliers, or competing unfairly with them.

Before antitrust authorities look to remedies like breaking up these companies, a good first step would be to require disclosure of information about the growth and health of the supply side of their marketplaces. The statistics about the growth of its third-party marketplace that Bezos trumpeted in his shareholder letter tell only half the story. The questions to ask are who profits, by how much, and how that allocation of rewards is changing over time…

Data is the currency of these companies. It should also be the currency of those looking to regulate them. You cannot regulate what you don’t understand. The algorithms that these companies use may be defended as trade secrets, but their outcomes should be open to inspection.
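O’Reilly’s suggested metric lends itself to a simple computation. Here is a toy sketch in Python– the numbers are invented, and real figures would have to come from the disclosures he proposes– of the supply-side “take rate” trend a regulator might track:

```python
# Hypothetical marketplace totals:
# year -> (gross sales on the platform, amount paid out to third-party suppliers)
yearly = {
    2015: (100.0, 85.0),
    2016: (130.0, 108.0),
    2017: (170.0, 136.0),
    2018: (220.0, 170.0),
}

for year, (gross, paid_out) in sorted(yearly.items()):
    take_rate = 1 - paid_out / gross  # the share the platform keeps
    print(f"{year}: platform take rate {take_rate:.0%}")

# Output: 15%, 17%, 20%, 23%.  Consumer prices never rose, yet the
# suppliers' share shrank each year -- the pattern flagged above as a
# possible sign of monopolistic rent extraction.
```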

An important read: “Antitrust regulators are using the wrong tools to break up Big Tech.”

* Molly Ivins

###

As we bust trusts, we might recall that it was on this date in 1974 that the Supreme Court handed down its unanimous decision in United States v. Nixon, ordering President Nixon to deliver tape recordings and other subpoenaed materials to a federal district court.  Special prosecutor Leon Jaworski had subpoenaed the tapes for use in the criminal prosecution of the Watergate conspirators; the White House had sued to quash the subpoena; and the decision is widely viewed as a crucial precedent limiting the power of any U.S. president to claim executive privilege.

source

 

 

Written by LW

July 24, 2019 at 1:01 am

“There is no sincerer love than the love of food”*…

 

French food

 

As the rest of the world began to (re)discover its own cuisines and innovate, the French restaurant seemed to be stagnating in a pool of congealing demi-glace.

Elsewhere, places such as Balthazar in New York and the Wolseley in London seemed to be doing the French restaurant better than the French. In France, the old guard of critics and restaurateurs remained convinced that French cuisine was still the best in the world and a point of national pride. The bistros cleaved to the traditional red-and-white checked table cloths and chalked-up menus even as they were microwaving pre-prepared boeuf bourguignon in the back. In 2010, when the French restaurant meal was added to Unesco’s list of the world’s “intangible cultural heritage”, it felt as if the French restaurant had become a museum piece, and a parody of itself.

The perceived excellence of their cuisine and restaurants has long represented a vital part of French national identity. It was too easy to ascribe this decline to a certain national conservatism, complacency and parochialism – facile Anglo-Saxon taunts. The real story is more complicated. The restaurant business always has been subject to changes in society and economic circumstances. Food – what we eat and how we go out to eat it – is constantly evolving, according to trend and time…

French food was the envy of the world – before it became trapped by its own history.  Can a new school of traditionalists revive its glories? “The rise and fall of French cuisine.”

* George Bernard Shaw

###

As we ponder prandial progress, we might recall that it was on this date in 1904 (as the Library of Congress notes) that the first ice cream cone was served.

On July 23, 1904, according to some accounts, Charles E. Menches conceived the idea of filling a pastry cone with two scoops of ice-cream and thereby invented the ice-cream cone. He is one of several claimants to that honor: Ernest Hamwi, Abe Doumar, Albert and Nick Kabbaz, Arnold Fornachou, and David Avayou all have been touted as the inventor(s) of the first edible cone. Interestingly, these individuals have in common the fact that they all made or sold confections at the 1904 Louisiana Purchase Exposition, known as the St. Louis World’s Fair. It is from the time of the Fair that the edible “cornucopia,” a cone made from a rolled waffle, vaulted into popularity in the United States.

Another claimant, Italo Marchiony, actually received a patent in 1903 for a device to make edible cups with handles. However the patent drawings show the device as a molded container rather than the rolled waffle seen at the Fair. Although paper and metal cones were used by Europeans to hold ice cream and pita bread was used by Middle Easterners to hold sweets, the ice-cream cone seems to have come to America by way of “the Pike” (as the entertainment midway of the St. Louis World’s Fair was called).

Randolph Smith Lyon, Mildred Frances Lyon, Mrs. Montague Lyon (Frances Robnett Smith Lyon), Montague Lyon, Jr., eating ice cream cones at the 1904 World’s Fair. Snapshot photograph, 1904.  (Missouri History Museum)

The 1904 World’s Fair in St. Louis: site of the national debuts of peanut butter, the hot dog, Dr Pepper, iced tea, cotton candy– and of course, ice cream cones. (source)

 

Written by LW

July 23, 2019 at 1:01 am
