(Roughly) Daily

Posts Tagged ‘IBM’

“There are two ways to make money in business: bundling and unbundling”*…


Many ventures seek profit by repackaging existing goods and services as revenue streams they can control, with technology frequently serving as the mechanism. The tech industry’s mythology about itself as a “disruptor” of the status quo revolves around this concept: Inefficient bundles (newspapers, cable TV, shopping malls) are disaggregated by companies that serve consumers better by letting them choose the features they want as stand-alone products, unencumbered of their former baggage. Why pay for a package of thousands of unwatched cable television channels, when you can pay for only the ones you watch? Who wants to subsidize journalism when all you care about is sports scores?

Media has been the most obvious target of digital unbundling because of the internet’s ability to subsume other forms and modularize their content. But almost anything can be understood as a bundle of some kind — a messy entanglement of variously useful functions embedded in a set of objects, places, institutions, and jobs that is rarely optimized for serving a single purpose. And accordingly, we hear promises to unbundle more and more entities. Transportation systems are being unbundled by various ridesharing and other mobility-as-a-service startups, causing driving, parking, navigation, and vehicle maintenance to decouple from their traditional locus in the privately owned automobile. Higher education, which has historically embedded classroom learning in an expensive bundle that often includes residence on campus and extracurricular activities, is undergoing a similar change via tools for remote learning…

Things that have been unbundled rarely remain unbundled for very long. Whether digital or physical, people actually like bundles, because they supply a legible social structure and simplify the complexity presented by a paralyzing array of consumer choices. The Silicon Valley disruption narrative implies that bundles are suboptimal and thus bad, but as it turns out, it is only someone else’s bundles that are bad: The tech industry’s unbundling has actually paved the way for invidious forms of rebundling. The apps and services that replaced the newspaper are now bundled on iPhone home screens or within social media platforms, where they are combined with new things that no consumer asked for: advertising, data mining, and manipulative interfaces. Facebook, for instance, unbundled a variety of long-established social practices from their existing analog context — photo sharing, wishing a friend happy birthday, or inviting someone to a party — and recombined them into its new bundle, accompanied by ad targeting and algorithmic filtering. In such cases, a bundle becomes less a bargain than a form of coercion, locking users into arrangements that are harder to escape than what they replaced. Ironically, digital bundles like Facebook also introduce novel ambiguities and adjacencies in place of those they sought to eliminate, such as anger about the political leanings of distant acquaintances or awareness of social gatherings that happened without you (side effects that are likely to motivate future unbundling efforts in turn)…

In a consideration of one of the most fundamental dynamics afoot in our economy today, and of its consequences, Drew Austin observes that no goods or services are stand-alone: “Bundling and Unbundling.”

* Jim Barksdale (in 1995, when he was the CEO of Netscape)

###

As we contemplate connection, we might recall that it was on this date in 1980 that IBM and Microsoft signed the agreement that made Microsoft the supplier of the operating system for the soon-to-be-released IBM PC.  IBM had hoped to do a deal with Digital Research (the creators of CP/M), but DR would not sign an NDA.

On Nov. 6, 1980, the contract that would change the future of computing was signed: IBM would pay Microsoft $430,000 for what would be called MS-DOS. But the key provision in that agreement was the one that allowed Microsoft to license the operating system to other computer manufacturers besides IBM — a nonexclusive arrangement that IBM agreed to in part because it was caught up in decades of antitrust investigations and litigation. IBM’s legal caution, however, would prove to be Microsoft’s business windfall, opening the door for the company to become the dominant tech company of the era.

Hundreds of thousands of IBM computers were sold with MS-DOS, but more than that, Microsoft became the maker of the crucial connection that was needed between the software and hardware used to operate computers. Company revenue skyrocketed from $16 million in 1981 to $140 million in 1985 as other computer-makers like Tandy and Commodore also chose to partner with them.

And as Microsoft’s fortunes rose, IBM’s declined. The company known as Big Blue, which had once been the largest in America, and 3,000 times the size of Microsoft, lost control of the PC platform it had helped build as software became more important than hardware.  [source]


Paul Allen and Bill Gates in those early years

source

 

Written by LW

November 6, 2019 at 1:01 am

“One of the things I did not understand, was that these systems can be used to manipulate public opinion in ways that are quite inconsistent with what we think of as democracy”*…

 


Nineteen years ago, in his third annual call for answers to an Annual Question, John Brockman asked members of the Edge community what they believed to be “today’s [2000’s] most important unreported story.” The remarkable Howard Rheingold (@hrheingold) answered in a way that has turned out to be painfully prophetic…

The way we learn to use the Internet in the next few years (or fail to learn) will influence the way our grandchildren govern themselves. Yet only a tiny fraction of the news stories about the impact of the Net focus attention on the ways many to-many communication technology might be changing democracy — and those few stories that are published center on how traditional political parties are using the Web, not on how grassroots movements might be finding a voice…

Every communication technology alters governance and political processes. Candidates and issues are packaged and sold on television by the very same professionals who package and sell other commodities. In the age of mass media, the amount of money a candidate can spend on television advertising is the single most important influence on the electoral success. Now that the Internet has transformed every desktop into a printing press, broadcasting station, and place of assembly, will enough people learn to make use of this potential? Or will our lack of news, information, and understanding of the Net as a political tool prove insufficient against the centralization of capital, power, and knowledge that modern media also make possible?…

The political power afforded to citizens by the Web is not a technology issue. Technology makes a great democratization of publishing, journalism, public discourse possible, but does not determine whether or not that potential will be realized. Every computer connected to the Net can publish a manifesto, broadcast audio and video eyewitness reports of events in real time, host a virtual community where people argue about those manifestos and broadcasts. Will only the cranks, the enthusiasts, the fringe groups take advantage of this communication platform? Or will many-to-many communication skills become a broader literacy, the way knowing and arguing about the issues of the day in print was the literacy necessary for the American revolution?…

The Scylla and Charybdis of which Howard warned– centralization-by-capital/political power and atomization-into-cacophony (whether via the pollution of manipulation/“fake news” or simple tribalism)– is now all too apparent… even if it’s not at all clear how we sail safely between them.  It’s almost 20 years later– but not too late to heed Howard’s call, which you can read in full at “How Will The Internet Influence Democracy?”

* Eric Schmidt, Executive Chairman of Google [as Howard’s 2000 insight dawns on him in 2017, source]

###

As we try harder, we might recall that it was on this date in 1911 that financier and “Father of Trusts” Charles R. Flint incorporated The Computing-Tabulating-Recording Company as a holding company into which he rolled up manufacturers of record-keeping and measuring systems: Bundy Manufacturing Company, International Time Recording Company, The Tabulating Machine Company, and the Computing Scale Company of America.

Four years later Flint hired Thomas J. Watson, Sr. to run the company; nine years after that, in 1924, Watson organized the formerly disparate units into a single operating company, which he named “International Business Machines,” or as we now know it, IBM.

source

 

 

“The number of transistors on integrated circuits doubles approximately every two years”*…

 

Moore’s Law has held up almost astoundingly well…

 source (and larger version)
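To make the quoted rule concrete, here’s a minimal arithmetic sketch in Python. The 1971 baseline (the Intel 4004’s roughly 2,300 transistors) is a real data point, but the strict two-year doubling is an idealization for illustration, not a claim about any particular product line:

```python
# Illustrative only: project transistor counts under a strict
# "doubling every two years" rule, starting from the Intel 4004
# (1971, ~2,300 transistors).

def projected_transistors(year, base_year=1971, base_count=2_300,
                          doubling_period=2):
    """Transistor count implied by a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 1971: ~2,300
# 1981: ~73,600
# 1991: ~2,355,200
# 2001: ~75,366,400
# 2011: ~2,411,724,800
```

Two decades of doubling buys a factor of about a thousand, which is roughly the story the chart above tells.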

This seemingly inexorable march has enabled an extraordinary range of new products and services– from intercontinental ballistic missiles to global environmental monitoring systems and from smart phones to medical implants…  But researchers at Carnegie Mellon University are sounding an alarm…

The speed of our technology doubles every year, right? Not anymore. We’ve come to take for granted that as the years go on, computing technology gets faster, cheaper and more energy-efficient.

In their recent paper, “Science and research policy at the end of Moore’s law,” published in Nature Electronics, however, Carnegie Mellon University researchers Hassan Khan, David Hounshell, and Erica Fuchs argue that future advancement in microprocessors faces new and unprecedented challenges…

In the seven decades following the invention of the transistor at Bell Labs, warnings about impending limits to miniaturization and the corresponding slowdown of Moore’s Law have come regularly from industry observers and academic researchers. Despite these warnings, semiconductor technology continually progressed along the Moore’s Law trajectory. Khan, Hounshell, and Fuchs’ archival work and oral histories, however, make clear that times are changing.

“The current technological and structural challenges facing the industry are unprecedented and undermine the incentives for continued collective action in research and development,” the authors state in the paper, “which has underpinned the last 50 years of transformational worldwide economic growth and social advance.”

As the authors explain in their paper, progress in semiconductor technology is undergoing a seismic shift driven by changes in the underlying technology and product-end markets…

“To continue advancing general purpose computing capabilities at reduced cost with economy-wide benefits will likely require entirely new semiconductor process and device technology,” explains Engineering and Public Policy graduate Hassan Khan. “The underlying science for this technology is as of yet unknown, and will require significant research funds – an order of magnitude more than is being invested today.”

The authors conclude by arguing that the lack of private incentives creates a case for greatly increased public funding and the need for leadership beyond traditional stakeholders. They suggest that funding of $600 million per year is needed, with 90% of those funds coming from public research dollars and the rest most likely from defense agencies…

Read the complete summary at “Moore’s law has ended. What comes next?”; read the complete Nature article here.

* a paraphrase of Gordon Moore’s assertion– known as “Moore’s law”– in the thirty-fifth anniversary issue of Electronics magazine, published on April 19, 1965

###

As we pack ’em ever tighter, we might send carefully-computed birthday greetings to Thomas John Watson Sr.; he was born on this date in 1874.  A mentee of John Henry Patterson’s at NCR, where Watson began his career, Watson became the chairman and CEO of the Computing-Tabulating-Recording Company (CTR), which, in 1924, he renamed International Business Machines– IBM.  He began using his famous motto– THINK– while still at NCR, but carried it with him to IBM…  where it became that corporation’s first trademark (in 1935).  That motto was the inspiration for the naming of the Thinkpad– and Watson himself (along with Sherlock Holmes’ trusty companion), for the naming of IBM’s Artificial Intelligence product.

 source

 

“Don’t believe anything you read on the net. Except this. Well, including this, I suppose.”*…

 

Just a month ago, it was revealed that Facebook has more than two billion active monthly users. That means that in any given month, more than 25% of Earth’s population logs in to their Facebook account at least once.

This kind of scale is almost impossible to grasp.

Here’s one attempt to put it in perspective: imagine Yankee Stadium’s seats packed with 50,000 people, and multiply this by a factor of 40,000. That’s about how many different people log into Facebook every month worldwide.

The Yankee Stadium analogy sort of helps, but it’s still very hard to picture. The scale of the internet is so great that it doesn’t make sense to look at the information on a monthly basis, or even to use daily figures.

Instead, let’s drill down to what happens in just one internet minute…
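The arithmetic behind those figures is easy to verify; here’s a quick sanity check in Python (the ~7.5 billion world-population figure for 2017 is our assumption, not the article’s):

```python
# Sanity-check the stadium analogy and the "more than 25% of Earth" claim.
monthly_users = 2_000_000_000     # Facebook monthly actives, per the article
stadium_seats = 50_000            # Yankee Stadium, roughly
world_population = 7_500_000_000  # ~2017 estimate (our assumption)

print(monthly_users / stadium_seats)           # 40000.0 -- stadiums' worth
print(100 * monthly_users / world_population)  # ~26.7  -- percent of Earth
```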

More at “What Happens in an Internet Minute in 2017?”

And for a cogent consideration of what all this might mean, see “You Are the Product.”

* Douglas Adams

###

As we retreat behind the firewall, we might recall that it was on this date in 1980 that the Project Chess team at IBM showed a prototype microcomputer to their corporate management. Management gave approval– and a one-year deadline– for the team to build an operational computer to compete in the rapidly emerging personal computer market. One year and four days later, the IBM PC was introduced to the world… and the rest is history.

 source

 

Written by LW

August 8, 2017 at 1:01 am

“In any field, it is easy to see who the pioneers are — they are the ones lying face down with arrows in their backs”*…

 

The story of Vector Graphic, a personal computer company that outran Apple in the industry’s early days: “How Two Bored 1970s Housewives Helped Create the PC Industry.”

* Anonymous

###

As we try to remember what “CP/M” stood for, we might recall that it was on this date in 1991 (on the anniversary of the issuing of IBM’s first patent in 1911) that Microsoft Corp. for the first time reported revenues of more than $1 billion for its fiscal year (1990), the first software company ever to achieve that scale.  While in this age of “unicorns,” a billion dollars in revenue seems a quaint marker, that was real money at the time.

As readers who followed the link above will know, Microsoft, founded in 1975, was an early purveyor of the CP/M operating system on which the Vector ran; but (unlike Vector) Gates and Allen embraced IBM’s new architecture, creating DOS (for younger readers: the forerunner of Windows)… and laying the foundation for Microsoft’s extraordinary growth.

Bill Gates in 1990

 source

 

Written by LW

July 25, 2015 at 1:01 am

I for one welcome our new computer overlords…

source

In the aftermath of Watson’s triumph over humanity’s best, your correspondent thought it wise to remind readers (and himself) that this is not the first time that we mortals have faced the onslaught of astounding new technology.

The good folks at Dark Roasted Blend have compiled a nifty through-the-ages recap of attempts to create “life” in new-fangled ways; from Leonardo’s “robot” and John Dee’s “flying beetle” to a “steam-powered hiker” and an “electric milk man” from Victorian England, there’s quite a selection in “Amazing Automatons: Ancient Robots & Victorian Androids.”

It’s all fascinating; but the sweet spot is surely the selection of creations from the 18th (and early 19th) centuries, when the then-highly-developed crafts of metal working and watchmaking were turned to automata.  Consider, for example…

Jacques Vaucanson created numerous working figures, including a flute player, which actually played the instrument, in 1738, plus this duck from 1739. The gilded copper bird could sit, stand, splash around in water, quack and even give the impression of eating food and digesting it.

Pierre Jaquet-Droz created three automata, The Writer, The Draughtsman and The Musician, which are still considered scientific marvels today. The Draughtsman is capable of producing four distinct pictures, while the Writer dips his pen in the ink and can write as many as forty letters. The Musician’s fingers actually play the organ and the figure ends her performance with a bow.

More at Dark Roasted Blend.

As we remind ourselves to re-read Kevin Kelly’s excellent What Technology Wants and then to retake the Turing Test, we might stage a dramatic memorial for dramatist and scenic innovator James Morrison Steele (“Steele”) MacKaye; he died on this date in 1894.  He opened the Madison Square Theatre in 1879, where he created a huge elevator with two stages stacked one on top of the other so that elaborate furnishings could be changed quickly between scenes. MacKaye was the first to light a New York theatre– the Lyceum, which he founded in 1884– entirely by electricity. And he invented and installed overhead and indirect stage lighting, movable stage wagons, artificial ventilation, the disappearing orchestra pit, and folding seats. In all, MacKaye patented over a hundred inventions, mostly for the improvement of theatrical production and its experience.

Steele MacKaye

Let’s get small…

Gerd Binnig and Heinrich Rohrer, inventors of the Scanning Tunneling Microscope (source: IBM)

Twenty years ago, technicians at IBM’s Almaden Research Lab pulled a nifty stunt with their scanning tunneling microscope (STM).  IBM scientists had invented the STM nine years earlier in IBM’s Zurich Lab (and received a Nobel prize for it in 1986); while the STM was originally intended simply to create visualizations of things very, very tiny, the folks at Almaden realized that the technique used– it “felt” the atoms in question with similarly-charged particles, then mapped the object– could be reversed: the STM could change its charge, “pin” an atom, and move it…  The first illustration– and, some argue, the first example of “practical” nanotechnology– was this IBM logo, “written” in xenon atoms:

source: IBM

Over the last two decades, the STM has become a critical tool for chip makers, enabling them to perfect current DRAM and flash memories.  Now the folks at Almaden, still pushing the limits of their gear, have turned their STMs into slo-mo movie cameras, and captured the atomic process of setting and erasing a bit on a single atom– that’s to say, of the operation of a single-atom DRAM.

Practical applications– atomic memories, better solar cells, and ultimately, atomic-scale quantum computers– are, of course, some way off… but Moore’s Law seems safe for a while.

Read all about it in EE Times.

As we drop the needle on that Steve Martin album, we might recall that it was on this date in 1908 that the Model T went on sale; it cost $825 (roughly equivalent to $20,000 today).  Ford’s advances in the technologies used both in the car and in its manufacture, along with economies of scale, resulted in steady price reductions over the next decade: by the 1920s, the price had fallen to $290 (equivalent to roughly $3,250 today).
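Those “equivalent to” figures follow from the standard CPI-ratio adjustment, sketched below; the CPI values used are rough assumptions on our part, which is why the outputs land near, but not exactly on, the numbers above:

```python
# Rough CPI-ratio inflation adjustment: price_then * (CPI_now / CPI_then).
# CPI values are approximate assumptions (1982-84 average = 100).

def adjust(price_then, cpi_then, cpi_now=225.0):  # cpi_now ~2011, assumed
    return price_then * cpi_now / cpi_then

print(f"${adjust(825, 9.0):,.0f}")   # 1908's $825 -> ~$20,625
print(f"${adjust(290, 17.5):,.0f}")  # 1925's $290 -> ~$3,729
```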

1908 advertisement
