(Roughly) Daily

Posts Tagged ‘semiconductor’

“Humanity is acquiring all the right technology for all the wrong reasons”*…

Further to yesterday’s post on the poverty created by manufacturing displacement, and in the wake of the Sturm und Drang occasioned by the coup at OpenAI, the estimable Rana Foroohar on the politics of AI…

… Consider that current politics in the developed world — from the rise of Donald Trump to the growth of far right and far left politics in Europe — stem in large part from disruptions to the industrial workforce due to technology and globalisation. The hollowing out of manufacturing work led to more populist and fractious politics, as countries tried (and often failed) to balance the needs of the global marketplace with those of voters.

Now consider that this past summer, the OECD warned that white-collar, skilled labour representing about a third of the workforce in the US and other rich countries is most at risk from disruption by AI. We are already seeing this happen in office work — with women and Asians particularly at risk since they hold a disproportionate amount of roles in question. As our colleague John Burn-Murdoch has charted [image above], online freelancers are especially vulnerable.

So, what happens when you add more than three times as many workers, in new subgroups, to the cauldron of angry white men that have seen their jobs automated or outsourced in recent decades? Nothing good. I’m always struck when CEOs like Elon Musk proclaim that we are headed towards a world without work as if this is a good thing. As academics like Angus Deaton and Anne Case have laid out for some time now, a world without work very often leads to “deaths of despair,” broken families, and all sorts of social and political ills.

Now, to be fair, Goldman Sachs has estimated that the productivity impact of AI could double the recent rate — mirroring the impact of the PC revolution. This would lead to major growth which could, if widely shared, do everything from cut child poverty to reduce our burgeoning deficit.

But that’s only if it’s shared. And the historical trend lines for technology aren’t good in that sense — technology often widens wealth disparities before labour movements and government regulation equalise things. (Think about the turn of the 20th century, up until the 1930s). But the depth and breadth of AI disruption may well cause unprecedented levels of global labour displacement and political unrest.

I am getting more and more worried that this is where we may be heading. Consider this new National Bureau of Economic Research working paper, which analyses why AI will be as transformative as the industrial revolution. It also predicts, however, that there is a very good chance that it lowers the labour share radically, even pushing it to zero, in lieu of policies that prevent this (the wonderful Daron Acemoglu and Simon Johnson make similar points, and lay out the history of such tech transformation in their book Power and Progress).

We can’t educate ourselves out of this problem fast enough (or perhaps at all). We also can’t count on universal basic income to fix everything, no matter how generous it could be, because people simply need work to function (as Freud said, it’s all about work and love). Economists and political scientists have been pondering the existential risks of AI — from nuclear war to a pandemic — for years. But I wonder if the real existential crisis isn’t a massive crisis of meaning, and the resulting politics of despair, as work is displaced faster than we can fix the problem…

Everyone’s worried about AI, but are we worried about the right thing? “The politics of AI,” from @RanaForoohar in @FT.

See also: Henry Farrell‘s “What OpenAI shares with Scientology” (“strange beliefs, fights over money, and bad science fiction”) and Dave Karpf‘s “On OpenAI: Let Them Fight.” (“It’s chaos… And that’s a good thing.”)

For a different point-of-view, see: “OpenAI and the Biggest Threat in the History of Humanity,” from Tomás Pueyo.

And for deep background, read Benjamin Labatut‘s remarkable The MANIAC.

* R. Buckminster Fuller

###

As we equilibrate, we might recall that it was on this date in 1874 that electrical engineer, inventor, and physicist Ferdinand Braun published a paper in the Annalen der Physik und Chemie describing his discovery of the electrical rectifier effect, the original practical semiconductor device.

(Braun is better known for his contributions to the development of radio and television technology: he shared the 1909 Nobel Prize in Physics with Guglielmo Marconi “for their contributions to the development of wireless telegraphy” (Braun invented the crystal tuner and the phased-array antenna); was a founder of Telefunken, one of the pioneering communications and television companies; and, as the builder of the first cathode ray tube, has been called the “father of television,” a title shared with inventors like Paul Gottlieb Nipkow.)

source

“The ‘paradox’ is only a conflict between reality and your feeling of what reality ‘ought to be’”*…

John Stewart Bell (1928-1990), the Northern Irish physicist whose work sparked a quiet revolution in quantum physics

Elegant experiments with entangled light have laid bare a profound mystery at the heart of reality. Daniel Garisto explains the importance of the work done by this year’s Nobel laureates in Physics…

One of the more unsettling discoveries in the past half century is that the universe is not locally real. “Real,” meaning that objects have definite properties independent of observation—an apple can be red even when no one is looking; “local” means objects can only be influenced by their surroundings, and that any influence cannot travel faster than light. Investigations at the frontiers of quantum physics have found that these things cannot both be true. Instead, the evidence shows objects are not influenced solely by their surroundings and they may also lack definite properties prior to measurement. As Albert Einstein famously bemoaned to a friend, “Do you really believe the moon is not there when you are not looking at it?”

This is, of course, deeply contrary to our everyday experiences. To paraphrase Douglas Adams, the demise of local realism has made a lot of people very angry and been widely regarded as a bad move.

Blame for this achievement has now been laid squarely on the shoulders of three physicists: John Clauser, Alain Aspect and Anton Zeilinger. They equally split the 2022 Nobel Prize in Physics “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” (“Bell inequalities” refers to the pioneering work of the Northern Irish physicist John Stewart Bell, who laid the foundations for this year’s Physics Nobel in the early 1960s.) Colleagues agreed that the trio had it coming, deserving this reckoning for overthrowing reality as we know it. “It is fantastic news. It was long overdue,” says Sandu Popescu, a quantum physicist at the University of Bristol. “Without any doubt, the prize is well-deserved.”

“The experiments beginning with the earliest one of Clauser and continuing along, show that this stuff isn’t just philosophical, it’s real—and like other real things, potentially useful,” says Charles Bennett, an eminent quantum researcher at IBM…

Quantum foundations’ journey from fringe to favor was a long one. From about 1940 until as late as 1990, the topic was often treated as philosophy at best and crackpottery at worst. Many scientific journals refused to publish papers in quantum foundations, and academic positions indulging such investigations were nearly impossible to come by…

Today, quantum information science is among the most vibrant and impactful subfields in all of physics. It links Einstein’s general theory of relativity with quantum mechanics via the still-mysterious behavior of black holes. It dictates the design and function of quantum sensors, which are increasingly being used to study everything from earthquakes to dark matter. And it clarifies the often-confusing nature of quantum entanglement, a phenomenon that is pivotal to modern materials science and that lies at the heart of quantum computing…

Eminently worth reading in full: “The Universe Is Not Locally Real, and the Physics Nobel Prize Winners Proved It,” from @dangaristo in @sciam.
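For those wondering what a “Bell inequality” actually says, the most commonly tested version is the CHSH form, summarized here from the standard textbook statement rather than from the laureates’ particular experimental setups. Each half of an entangled photon pair is measured with one of two detector settings (a or a′ on one side, b or b′ on the other), and E(x, y) is the measured correlation between the outcomes:

S = E(a, b) − E(a, b′) + E(a′, b) + E(a′, b′)
|S| ≤ 2 for any local-realist (“hidden variable”) account
|S| ≤ 2√2 ≈ 2.83 for quantum mechanics, the bound entangled photons can reach

Measured values of S greater than 2, which the experiments of Clauser, Aspect, and Zeilinger repeatedly produced, are what “violation of Bell inequalities” means.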

Apposite: entangled particles and wormholes could be manifestations of the same phenomenon, and resolve paradoxes like information escaping a black hole: “Black Holes May Hide a Mind-Bending Secret About Our Universe.” 

* Richard Feynman

###

As we rethink reality, we might spare a thought for Walter Brattain; he died on this date in 1987. A physicist (at Bell Labs at the time), he worked with John Bardeen and William Shockley to invent the point-contact transistor in 1947, the birth of the semiconductor age– work for which the trio shared the Nobel Prize in Physics in 1956.

At college, Brattain said, he majored in physics and math because they were the only subjects he was good at. He became a solid physicist with a good understanding of theory, but his strength lay in physically constructing experiments: working from the ideas of Shockley and Bardeen, it was Brattain’s hands that built the first transistor. Before long, the transistor replaced the bulkier vacuum tube in many applications and became the forerunner of today’s microminiature electronics.

As semiconductor technology has advanced, it has begun to incorporate quantum effects.

source

“Artificial intelligence is growing up fast”*…

A simple prototype system sidesteps the computing bottleneck in tuning (that is, teaching) artificial intelligence algorithms…

A simple electrical circuit [pictured above] has learned to recognize flowers based on their petal size. That may seem trivial compared with artificial intelligence (AI) systems that recognize faces in a crowd, transcribe spoken words into text, and perform other astounding feats. However, the tiny circuit outshines conventional machine learning systems in one key way: It teaches itself without any help from a computer—akin to a living brain. The result demonstrates one way to avoid the massive amount of computation typically required to tune an AI system, an issue that could become more of a roadblock as such programs grow increasingly complex.

“It’s a proof of principle,” says Samuel Dillavou, a physicist at the University of Pennsylvania who presented the work here this week at the annual March meeting of the American Physical Society. “We are learning something about learning.”…

More at “Simple electrical circuit learns on its own—with no help from a computer,” from @ScienceMagazine.
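By way of a rough illustration of how a circuit can teach itself with only local information, here is a drastically simplified numerical cartoon in the spirit of the physics-driven “coupled learning” approach this line of work draws on: a two-conductance voltage divider that adjusts each conductance using only the voltage drop across that element, with no global gradient computation. The two-resistor network, the target task, and every constant below are assumptions made for the sketch; they are not the Penn group’s actual analog circuit or its flower-classification demo.

def free_output(v_in, k1, k2):
    # The physics does the "forward pass": the divider settles at its natural output,
    # v_out = v_in * k1 / (k1 + k2), for conductances k1 (input-to-output) and k2 (output-to-ground).
    return v_in * k1 / (k1 + k2)

def train(samples, k1=1.0, k2=1.0, eta=0.1, rate=0.05, epochs=200):
    for _ in range(epochs):
        for v_in, v_target in samples:
            v_free = free_output(v_in, k1, k2)            # "free" state: circuit left alone
            v_clamp = v_free + eta * (v_target - v_free)  # "clamped" state: output nudged toward target
            # Local rule: each conductance compares its own squared voltage drop in the
            # two states; nothing about the rest of the network is needed.
            k1 += rate / (2 * eta) * ((v_in - v_free) ** 2 - (v_in - v_clamp) ** 2)
            k2 += rate / (2 * eta) * (v_free ** 2 - v_clamp ** 2)
            k1, k2 = max(k1, 1e-6), max(k2, 1e-6)         # conductances must stay positive
    return k1, k2

# Assumed toy task: the output should settle at 30% of whatever input voltage is applied.
data = [(v, 0.3 * v) for v in (0.5, 1.0, 1.5)]
k1, k2 = train(data)
print(round(free_output(4.0, k1, k2), 3))  # expect roughly 0.3 * 4.0 = 1.2

The point the sketch tries to convey is the one in the article: the update rule needs only quantities each element can “measure” about itself in two physical states (free, and gently clamped toward the desired output), so the network itself can do the work of learning.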

* Diane Ackerman

###

As we brace ourselves (and lest we doubt that big things can grow from humble beginnings like these), we might recall that it was on this date in 1959 that Texas Instruments (TI) demonstrated the first working integrated circuit (IC), which had been invented by Jack Kilby. Kilby created the device to prove that resistors and capacitors could exist on the same piece of semiconductor material. His circuit consisted of a sliver of germanium with five components linked by wires. It was Fairchild’s Robert Noyce, however, who filed for a patent within months of Kilby and who made the IC a commercially viable technology. Both men are credited as co-inventors of the IC. (Kilby won the Nobel Prize for his work in 2000; Noyce, who died in 1990, did not share.)

Kilby and his first IC (source)

“Please cut off a nanosecond and send it over to me”*…

Commodore Grace M. Hopper, USN

“Amazing Grace” Hopper, seminal computer scientist and Rear Admiral in the U.S. Navy, explains a “nanosecond”…
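(The prop she holds up is a length of wire representing how far light travels in one nanosecond, an upper bound on how far any signal can get in that time. The arithmetic is quick: 299,792,458 m/s × 10⁻⁹ s ≈ 0.2998 m, or about 11.8 inches, which is why her “nanoseconds” were roughly foot-long pieces of wire.)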

* Grace Hopper

###

As we celebrate clarity, we might recall that it was on this date in 1961 that Robert Noyce was issued patent number 2,981,877 for his “semiconductor device-and-lead structure,” the first patent for what would come to be known as the integrated circuit. In fact, another engineer, Jack Kilby, had separately and essentially simultaneously developed the same technology (Kilby’s design was rooted in germanium; Noyce’s in silicon) and had filed a few months earlier than Noyce… a fact that was recognized in 2000 when Kilby was awarded the Nobel Prize in Physics– in which Noyce, who had died in 1990, did not share.

Noyce (left) and Kilby (right)

 source

 

 

 

Written by (Roughly) Daily

April 25, 2019 at 1:01 am
