(Roughly) Daily

Posts Tagged ‘Microsoft’

“Cyberspace undeniably reflects some form of geography”*…

Your correspondent is stepping again into the rapids, so (Roughly) Daily is going on a short hiatus. Regular service should resume on or around Nov 4. Here, something to enjoy in the meantime…

Our old friend Neal Agarwal has created an interactive museum of sorts, a stroll through the history of the internet, as manifest in the artifacts of important “firsts”– the first smiley, the first MP3, the first “LOL,” the first live-streamed concert, and so, so much more…

Browse through Internet Artifacts, from @nealagarwal.

* Sandra Day O’Connor

###

As we touch the exhibits, we might send imperial birthday greetings to William Henry Gates III; he was born on this date in 1955. Gates is, of course, best known for co-founding the technology giant Microsoft, along with his childhood friend Paul Allen. He led the company from its packaged-software beginnings onto the internet. He has also founded several other ventures, including BEN, Cascade Investment, TerraPower, bgC3, and Breakthrough Energy; but since stepping back from day-to-day work at Microsoft in 2008, he has increasingly turned his attention to philanthropy.

Bill Gates

source

Written by (Roughly) Daily

October 28, 2023 at 1:00 am

“In order for the United States to do the right things for the long term, it appears to be helpful for us to have the prospect of humiliation. Sputnik helped us fund good science – really good science: the semiconductor came out of it.”*…

Now the semiconductor itself is in question… and as Arthur Goldhammer explains in his review of Chris Miller’s important new book Chip War, the right response may not be as clear as many suggest…

In left-liberal circles there is a rough consensus about what has gone wrong with our politics over the past 40 years. The critique can be summed up in two words: neoliberalism and globalization. Although these capacious ideological generalizations cover a multitude of sins, the gravamen of the charge against both is that, in the name of economic efficiency and growth, globalizing neoliberals of both the right and the left justified depriving national governments of the power to reduce inequalities of wealth and income, promote equal opportunity, and protect the health and welfare of the citizenry. Neoliberals prioritized property rights over social and political rights and protected markets from political meddling. They removed regulatory fetters on the movement of capital and sought the cheapest labor they could find to put their money to work. As a result, from the late 1970s on, governments across the developed world retreated from the social democratic reforms credited with fostering the harmonious prosperity of the three decades following World War II—the period the French have dubbed les Trente Glorieuses—thereby triggering a populist and xenophobic backlash while polarizing previously consensual political systems and weakening resistance to authoritarian demagogues.

This account of political change across the Western world since the 1980s has much to recommend it, not least the implication that the globalized neoliberal regime has sown the seeds of its own impending demise. This is the view espoused in one form or another by a number of excellent recent books, among them Gary Gerstle’s The Rise and Fall of the Neoliberal Order, Michael Tomasky’s The Middle Out, and Bradford DeLong’s Slouching Towards Utopia. Yet each of these estimable authors embraces the notion that the novel feature of the period was superstructural, to borrow a term of art from the Marxist lexicon: All believe that ideology was in the driver’s seat and that it was the readiness of left-liberals to accede to the tenets of market-first ideology that established neoliberalism as the unsurpassable political horizon of the age (to borrow a phrase from philosopher Jean-Paul Sartre).

But what if this superstructural interpretation is incomplete? What if it blinds us to a deeper transformation of the means of production themselves? What if the key innovation of the 1970s and ’80s was the advent not of neoliberal ideology but of the microprocessor, which simultaneously created new markets, dramatically altered trade flows, and shifted both the economic and military balance of power among nations? And what if this crucial technological innovation can trace its roots all the way back to the aforementioned Trente Glorieuses? What if the glory years of social democracy saw the benefits of higher education spread much more widely than ever before, disseminating technological skills throughout the world and making it possible to tap far more of humanity’s collective brainpower, while creating a web of interdependent corporations spanning both the developed and less developed worlds? The microprocessor not only became the flagship product of the neoliberal era’s dominant industry but also served as its indispensable instrument, without which it would have been impossible to tame the torrents of information necessary to manage far-flung supply chains and global capital flows.

Chris Miller’s Chip War deserves credit precisely for redirecting our attention from superstructure to base, from the high political drama of the past four decades to the more prosaic business of manufacturing microchips. At its most basic level, the book offers a masterful history of the semiconductor industry, from the invention of the first transistor in 1947 to the incredibly complex machinery required to deposit tens of billions of nearly atom-sized switches on a silicon chip no larger than a fingernail. Miller, who teaches international history at Tufts University’s Fletcher School, emphasizes the national security implications of a global supply chain in which components crucial to U.S. defense must pass through choke points, such as Taiwan, that are subject to intervention by commercial and strategic rivals. But the history he recounts in vivid detail also tells a more hopeful story, illustrating the way in which globalization has made it possible to mobilize humanity’s collective brainpower to achieve progress that no single country could have achieved on its own.

In assessing the national security risks posed by China’s semiconductor ambitions, some analysts seem to have accepted at face value Andy Grove’s adage that “only the paranoid survive.” While one former UK intelligence official argued that “we should accept that China will be a global tech power in the future and start managing the risk,” the United States, taking a darker view of China’s aims, has set out to stop China in its tracks by pressuring allies to reject Huawei equipment, by banning the export of certain U.S.-developed technologies to China under sweeping 2022 export controls, and by subsidizing domestic manufacturing through the CHIPS Act of 2022 and related legislation.

Such aggressive policies could backfire, however. Miller quotes China tech policy analyst Dan Wang, who argues that American restrictions have “boosted Beijing’s quest for tech dominance” by catalyzing new Chinese government policies that support its domestic chip industry, including the training of tens of thousands of electrical engineers and condensed-matter physicists. There are good reasons to worry about China’s military ambitions, but it is probably futile to try to halt the spread of technology as though it were a bulk good susceptible to blockade. There are also less aggressive ways to alleviate Chinese threats to the global supply chain: for instance, U.S. incentives have encouraged TSMC to move some of its operations from Taiwan to Arizona.

Finally, history shows that trying to stymie competitors by impeding the flow of technical information is unlikely to work against an adversary like China, with its large pool of educated workers and substantial ability to invest in research and development. Remember that Britain tried to monopolize textile technology in the late eighteenth and early nineteenth centuries, but Samuel Slater, the “father of the American Industrial Revolution,” used his knowledge of British machine designs to develop the technology in his adopted country. The way to compete effectively with China is not to ratchet up bellicose rhetoric about defending Taiwan or attempt to halt the spread of technical know-how by drafting new CHIPS Acts, but to educate American workers and foster closer cooperation with other countries that have taken the lead in developing key aspects of the semiconductor manufacturing process. The history that Miller recounts demonstrates that what matters most in achieving technological leadership is the free movement of people and ideas, not tariffs, export controls, or paranoid levels of fear. The best counterweight to Chinese military and commercial ambitions is the collective brainpower of the democratic world, not chip embargoes and saber-rattling…

The United States wants to stop China’s semiconductor industry in its tracks. Here’s how that could backfire: “Chip Shots,” from @artgoldhammer in @DemJournal. Eminently worth reading in full.

See also: “No, I Do Not Think the Microprocessor Doomed Social Democracy,” an elaboration on and response to Goldhammer from Brad DeLong (@delong).

* Bill Gates

###

As we ponder policy, we might recall that it was on this date in 1980 that Microsoft launched its first hardware product, the Z-80 SoftCard.

The brainchild of Paul Allen, the SoftCard was a plug-in card built around a Zilog Z-80 microprocessor; installed in an Apple II personal computer, it allowed the machine to run programs written for the CP/M operating system. CP/M was a very popular OS for early personal computers, one for which much software was written. Indeed, the word processor WordStar was so popular that users purchased the SoftCard and a companion “80-column card” just to run it on the Apple II. At one point, the SoftCard brought in about half of Microsoft’s total revenue. It was discontinued in 1986, as CP/M’s popularity waned in the face of competition from Microsoft’s own MS-DOS (and the growing popularity of Microsoft’s Word and Excel applications).

source

“I like boring things”*…

What’s not to like?…

A YouTube video titled “THE MOST BORING VIDEO EVER MADE (Microsoft Word tutorial, 1989)” has accrued over 1.5 million views despite its self-proclaimed boringness. The video, an hour-and-forty-seven-minute computer tutorial, appears to have been recorded in one long take. It’s a time capsule from the early days of home computers, and despite the monotonous, sleep-inducing narration, the instructions are quite thorough. In the video’s comments, viewers point out the mind-blowing drama at minute 59 and the charming quote “no ‘command m’ for ‘miracle.’”

“1989 Microsoft Word tutorial is ‘the most boring video ever made’,” from Annie Rauwerda in @BoingBoing.

Pair with this 1984 video of Stanley Kubrick discussing his favorite software manuals.

* Andy Warhol

###

As we take on tedium, we might send qualified birthday greetings to Edward William Bok; he was born on this date in 1863. An editor and Pulitzer Prize-winning author, he is best remembered for his 30-year stewardship of the Ladies’ Home Journal.

Bok’s overall concern was to promote his socially conservative vision of the ideal American household, with the wife as homemaker and child-rearer. At the Ladies’ Home Journal, Bok authored more than twenty articles opposing women’s suffrage, women working outside the home, women’s clubs, and education for women. He wrote that feminism would lead women to divorce, ill health, and even death. Bok viewed suffragists as traitors to their sex, saying “there is no greater enemy of woman than woman herself.”

(See here for a glimpse at his ambitions and impact.)

source

“Deciding what not to do is as important as deciding what to do”*…

 


 

Wirecutter is best known for recommending things that are the best of the best. But on occasion, we discover the worst of the worst.

Sometimes this happens during testing (like when we had to force down countless cups of bad Keurig coffee), or when an entire category fails to deliver (like great-smelling but useless essential oil bug repellents), or just because a thing has no business even existing (we’re looking at you, air fryers)…

A list of products to which we should just say no: “Wirecutter’s Worst Things for Most People.”

* Steve Jobs

###

As we resist the urge, we might recall that it was on this date in 1995 that (to the commercial accompaniment of The Rolling Stones’ “Start Me Up”) Microsoft released Windows 95 to retail.

source

 

Written by (Roughly) Daily

August 24, 2020 at 1:01 am

“There are two ways to make money in business: bundling and unbundling”*…


Many ventures seek profit by repackaging existing goods and services as revenue streams they can control, with technology frequently serving as the mechanism. The tech industry’s mythology about itself as a “disruptor” of the status quo revolves around this concept: Inefficient bundles (newspapers, cable TV, shopping malls) are disaggregated by companies that serve consumers better by letting them choose the features they want as stand-alone products, unencumbered of their former baggage. Why pay for a package of thousands of unwatched cable television channels, when you can pay for only the ones you watch? Who wants to subsidize journalism when all you care about is sports scores?

Media has been the most obvious target of digital unbundling because of the internet’s ability to subsume other forms and modularize their content. But almost anything can be understood as a bundle of some kind — a messy entanglement of variously useful functions embedded in a set of objects, places, institutions, and jobs that is rarely optimized for serving a single purpose. And accordingly, we hear promises to unbundle more and more entities. Transportation systems are being unbundled by various ridesharing and other mobility-as-a-service startups, causing driving, parking, navigation, and vehicle maintenance to decouple from their traditional locus in the privately owned automobile. Higher education, which has historically embedded classroom learning in an expensive bundle that often includes residence on campus and extracurricular activities, is undergoing a similar change via tools for remote learning…

Things that have been unbundled rarely remain unbundled for very long. Whether digital or physical, people actually like bundles, because they supply a legible social structure and simplify the complexity presented by a paralyzing array of consumer choices. The Silicon Valley disruption narrative implies that bundles are suboptimal and thus bad, but as it turns out, it is only someone else’s bundles that are bad: The tech industry’s unbundling has actually paved the way for invidious forms of rebundling. The apps and services that replaced the newspaper are now bundled on iPhone home screens or within social media platforms, where they are combined with new things that no consumer asked for: advertising, data mining, and manipulative interfaces. Facebook, for instance, unbundled a variety of long-established social practices from their existing analog context — photo sharing, wishing a friend happy birthday, or inviting someone to a party — and recombined them into its new bundle, accompanied by ad targeting and algorithmic filtering. In such cases, a bundle becomes less a bargain than a form of coercion, locking users into arrangements that are harder to escape than what they replaced. Ironically, digital bundles like Facebook also introduce novel ambiguities and adjacencies in place of those they sought to eliminate, such as anger about the political leanings of distant acquaintances or awareness of social gatherings that happened without you (side effects that are likely to motivate future unbundling efforts in turn)…
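The revenue logic beneath all this bundling and rebundling can be made concrete. Here is a minimal sketch, using entirely hypothetical willingness-to-pay numbers, of the classic observation (due to Adams and Yellen, 1976) that when customers value a pair of goods in negatively correlated ways, a single bundle price can extract more revenue than the best à la carte prices:

```python
# Toy model of bundling economics -- all numbers are hypothetical.
# Each customer has a willingness to pay for "news" and for "sports".

customers = [
    {"news": 9, "sports": 1},  # news junkie
    {"news": 1, "sports": 9},  # sports fan
    {"news": 5, "sports": 5},  # mildly interested in both
]

def a_la_carte_revenue(p_news, p_sports):
    """Goods sold separately; a customer buys a good iff it is worth the price."""
    total = 0
    for c in customers:
        if c["news"] >= p_news:
            total += p_news
        if c["sports"] >= p_sports:
            total += p_sports
    return total

def bundle_revenue(p_bundle):
    """Only the bundle is sold; a customer buys iff combined value covers the price."""
    return sum(p_bundle for c in customers
               if c["news"] + c["sports"] >= p_bundle)

# Brute-force the best prices under each regime.
best_separate = max(a_la_carte_revenue(pn, ps)
                    for pn in range(1, 11) for ps in range(1, 11))
best_bundled = max(bundle_revenue(p) for p in range(1, 21))

print(f"best a la carte revenue: {best_separate}")  # 20 (at prices 5 and 5)
print(f"best bundle revenue:     {best_bundled}")   # 30 (bundle priced at 10)
```

With other (equally hypothetical) valuation patterns the comparison flips and selling separately comes out ahead, which is one reason bundling and unbundling keep chasing each other…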

In a consideration of one of the most fundamental dynamics afoot in our economy today, and of its consequences, Drew Austin observes that no goods or services are stand-alone: “Bundling and Unbundling.”

* Jim Barksdale (in 1995, when he was the CEO of Netscape)

###

As we contemplate connection, we might recall that it was on this date in 1980 that IBM and Microsoft signed the agreement that made Microsoft the supplier of the operating system for the soon-to-be-released IBM PC.  IBM had hoped to do a deal with Digital Research (the creators of CP/M), but DR would not sign an NDA.

On Nov. 6, 1980, the contract that would change the future of computing was signed: IBM would pay Microsoft $430,000 for what would be called MS-DOS. But the key provision in that agreement was the one that allowed Microsoft to license the operating system to other computer manufacturers besides IBM — a nonexclusive arrangement that IBM agreed to in part because it was caught up in decades of antitrust investigations and litigation. IBM’s legal caution, however, would prove to be Microsoft’s business windfall, opening the door for the company to become the dominant tech company of the era.

Hundreds of thousands of IBM computers were sold with MS-DOS, but more than that, Microsoft became the maker of the crucial connection between the hardware and the software used to operate computers. Company revenue skyrocketed from $16 million in 1981 to $140 million in 1985, as other computer-makers like Tandy and Commodore also chose to partner with Microsoft.

And as Microsoft’s fortunes rose, IBM’s declined. The company known as Big Blue, which had once been the largest in America and 3,000 times the size of Microsoft, lost control of the PC platform it had helped build as software became more important than hardware. [source]


Paul Allen and Bill Gates in those early years

source

 

Written by (Roughly) Daily

November 6, 2019 at 1:01 am
