Posts Tagged ‘iPhone’
“Technological change is not additive; it is ecological. A new technology does not merely add something; it changes everything”*…
Where transformative technologies are concerned (at the risk of sounding tautological), Neil Postman is surely right. But then, as Roy Amara pointed out, “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” David Oks uses a common myth of technological replacement to illustrate– and more specifically, to observe that there’s a lot more to replacing labor than just automating tasks.
He begins by recounting an interview of J. D. Vance by Ross Douthat a few months ago in which (in response to a question from Douthat about the potential downsides of AI, in particular the prospect of its “obsoleting” human workers) Vance responded sanguinely, arguing that ATMs didn’t eliminate bank tellers. Indeed, Vance suggested, “we have more bank tellers today than we did when the ATM was created, but they’re doing slightly different work…”
There are two interesting things about what Vance said, both relating to the example that he chose about bank tellers and ATMs.
The first thing is what it tells us about who J. D. Vance is. The bank teller story—how ATMs were predicted to increase bank teller unemployment, but in fact did not—isn’t a story you’ll hear from politicians; in fact, for a long time, Barack Obama would claim, incorrectly, that ATMs had decreased the number of bank tellers, in order to suggest that the elevated unemployment rate during his presidency was due to productivity gains from technology. I’ve never heard a politician cite the bank teller story before: but I have seen the bank teller story cited in a lot of blogs. I’ve seen it cited, for example, by Scott Alexander and Matt Yglesias and Freddie deBoer; and I’ve heard it, upstream of the humble bloggers, from such fine economists as Daron Acemoglu and David Autor. The story of how ATMs didn’t automate bank tellers is, indeed, something of a minor parable of the economics profession…
… But the other thing about the bank teller story that Vance cites is that it’s wrong. We do not, contrary to what Vance claims, have “more bank tellers today than we did when the ATM was created”: we in fact have far fewer. The story he tells Douthat might have been true in 2000 or 2005, but it hasn’t been true for years. Bank teller employment has fallen off a cliff. Here is a graph of bank teller employment since 2000:
So what happened to bank tellers? Autor, Bessen, Vance, and the like are right to point out that ATMs did not reduce bank teller employment. But they miss the second half of the story, which is that another technology did. And that technology was the iPhone. The huge decline in bank teller employment that we’ve seen over the last 15-odd years is mainly a story about iPhones and what they made possible.
But why? Why did the ATM, literally called the automated teller machine, not automate the teller, while an entirely orthogonal technology—the iPhone—actually did?
The answer, I think, is complementarity.
In my last piece, on why I don’t think imminent mass job loss from AI is likely, I talked a lot about complementarity. The core point I made was that labor substitution is about comparative advantage, not absolute advantage: the relevant question for labor impacts is not whether AI can do the tasks that humans can do, but rather whether the aggregate output of humans working with AI is inferior to what AI can produce alone. And I suggested that given the vast number of frictions and bottlenecks that exist in any human domain—domains that are, after all, defined around human labor in all its warts and eccentricities, with workflows designed around humans in mind—we should expect to see a serious gap between the incredible power of the technology and its impacts on economic life.
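That comparative-advantage logic can be made concrete with a toy calculation (all productivities below are invented for illustration). Even when the AI is absolutely better at every task, the human retains a comparative advantage in the task where the AI’s edge is smallest, so the human-plus-AI team produces more than naive task-splitting would:

```python
# Toy sketch of comparative advantage (all numbers hypothetical).
# The AI is absolutely better at BOTH tasks, yet the team does best
# when the human specializes in the task where the AI's edge is smallest.

# Hypothetical units of output per hour.
output = {
    "ai":    {"drafting": 10.0, "judgment": 3.0},
    "human": {"drafting": 4.0,  "judgment": 2.0},
}

def team_output(ai_drafting_hours, human_drafting_hours, day=8.0):
    """Each worker splits a fixed 8-hour day between drafting and judgment."""
    drafting = (output["ai"]["drafting"] * ai_drafting_hours
                + output["human"]["drafting"] * human_drafting_hours)
    judgment = (output["ai"]["judgment"] * (day - ai_drafting_hours)
                + output["human"]["judgment"] * (day - human_drafting_hours))
    return drafting, judgment

# Naive plan: both workers split their day evenly between the two tasks.
even = team_output(ai_drafting_hours=4.0, human_drafting_hours=4.0)

# Specialized plan: the human does only judgment (their comparative
# advantage); the AI drafts most of the day, holding total judgment
# output at the same level as the naive plan.
spec = team_output(ai_drafting_hours=8.0 - 4.0 / 3.0, human_drafting_hours=0.0)

print("even split:  drafting=%.1f  judgment=%.1f" % even)   # 56.0 / 20.0
print("specialized: drafting=%.1f  judgment=%.1f" % spec)   # 66.7 / 20.0
```

With the same hours and the same judgment output, specialization yields roughly 19% more drafting– which is why the relevant question is the team’s aggregate output, not whether the AI can beat the human at any given task.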
That gap will probably close faster than previous gaps did: AI is not “like” electricity or the steam engine; an AI system is literally a machine that can think and do things itself. But the gap exists, and will exist even as the technology continues to amaze us with what it can now accomplish.
But by talking about why ATMs didn’t displace bank tellers but iPhones did, I want to highlight an important corollary, which is that the true force of a technology is felt not in the substitution of tasks, but in the invention of new paradigms. This is the famous lesson of electricity and productivity growth, which I’ll return to in a future piece. When a technology automates some of what a human does within an existing paradigm, even the vast majority of what a human does within it, it’s quite rare for it to actually get rid of the human, because the definition of the paradigm around human-shaped roles creates all sorts of bottlenecks and frictions that demand human involvement. It’s only when we see the construction of entirely new paradigms that the full power of a technology can be realized. The ATM substituted tasks; but the iPhone made them irrelevant…
[Oks unpacks the stories of the ATM’s and iPhone’s impact on banking, then looks ahead, by analogy, to what might be in store with AI. He concludes…]
… I am not a “denier” on the question of technological job loss; Vance’s blithe optimism is not mine. But I’m skeptical that simply slotting AI into human-shaped jobs will have the results people seem to expect. The history of technology, even exceptionally powerful general-purpose technology, tells us that as long as you are trying to fit capital into labor-shaped holes you will find yourself confronted by endless frictions: just as with electricity, the productivity inherent in any technology is unleashed only when you figure out how to organize work around it, rather than slotting it into what already exists. We are still very much in the regime of slotting it in. And as long as we are in that regime, I expect disappointing productivity gains and relatively little real displacement.
The real productivity gains from AI—and the real threat of labor displacement—will come not from the “drop-in remote worker,” but from something like Dwarkesh Patel’s vision of the fully-automated firm. At some point in the life of every technology, old workflows are replaced by new ones, and we discover the paradigms in which the full productive force of a technology can best be expressed. In the past this has simply been a fact of managerial turnover or depreciation cycles. But with AI it will likely be the sheer power of the technology itself, which really is wholly unlike anything that has come before, and unlike electricity or the steam engine will eventually be able to build the structures that harness its powers by itself.
I don’t think we’ve really yet learned what those new structures will look like. But, at the limit, I don’t quite know why humans have to be involved in those: though I suspect that by the time we’re dealing with the fully-automated organizations of the future, our current set of concerns will have been largely outmoded by new and quite foreign ones, as has always been the case with human progress.
But, however optimistic I might be about the human future, I don’t think it’s worth leaning on the history of past technologies for comfort. The ATM parable is a comforting narrative; and in times of uncertainty and fear we search naturally for solace and comfort wherever it may come. But even when it comes to bank tellers, it’s only the first half of the story…
Eminently worth reading in full: “Why ATMs didn’t kill bank teller jobs, but the iPhone did.”
As to whether the wisdom of Amara and Oks is widely shared, consider this from Crunchbase:
Crunchbase data shows global venture investment totaled $189 billion in February — the largest startup funding month on record — although 83% of capital raised went to just three companies. They include OpenAI, which raised $110 billion, also in the largest round ever raised by a private, venture-backed company.
The record month for venture funding took place against the backdrop of a trillion-dollar stock market drop as AI compute and tooling unsettled leading public software companies. [See also here.]
All told, venture investment was up close to 780% year over year from the $21.5 billion raised by startups in February 2025.
OpenAI was not the only company to raise tens of billions of dollars last month. Its closest rival, Anthropic, raised $30 billion, marking the third-largest venture round on record.
Waymo, Alphabet’s self-driving division, raised $16 billion. Together, those three rounds totaled $156 billion, representing 83% of the global venture capital raised in February.
A further four companies each raised $1 billion or more last month: Tokyo-based semiconductor manufacturer Rapidus; London-based self-driving platform Wayve; San Francisco-based AI for robotics World Labs; and Sunnyvale, California-based AI semiconductor company Cerebras Systems.
These massive rounds were led by strategic corporate investors, a host of private equity and alternative investors, as well as a few multistage venture investors and a government agency…
– “Massive AI Deals Drive $189B Startup Funding Record In February While Public Software Stocks Reel”
As Carlota Perez explains in Technological Revolutions and Financial Capital, we’re forever blowing bubbles…
* Neil Postman
###
As we contemplate change, we might send sanitary, odor-free birthday greetings to Sir Joseph William Bazalgette; he was born on this date in 1819. A civil engineer, he became chief engineer of London’s Metropolitan Board of Works, in which role his major achievement was a response to the “Great Stink” of July and August 1858, during which very hot weather exacerbated the smell of untreated human waste and industrial effluent. Bazalgette oversaw the creation of a sewer network for central London that addressed the problem– and was instrumental in relieving the city of cholera epidemics, in beginning the cleansing of the River Thames, and in creating (a crucial part of) the infrastructure that underlay its extraordinary growth over the next century.
“O brave new world, that has such people in ‘t!”*…
The estimable Steven Johnson suggests that the creation of Disney’s masterpiece, Snow White, gives us a preview of what may be coming with AI algorithms sophisticated enough to pass for sentient beings…
… You can make the argument that the single most dramatic acceleration point in the history of illusion occurred between the years of 1928 and 1937, the years between the release of Steamboat Willie [here], Disney’s breakthrough sound cartoon introducing Mickey Mouse, and the completion of his masterpiece, Snow White, the first long-form animated film in history [here– actually the first full-length animated feature produced in the U.S., and the first produced anywhere in color]. It is hard to think of another stretch where the formal possibilities of an artistic medium expanded in such a dramatic fashion, in such a short amount of time.
[There follows a fascinating history of the Disney Studios’ technical innovations that made Snow White possible, and an account of the film’s remarkable premiere…]
In just nine years, Disney and his team had transformed a quaint illusion—the dancing mouse is whistling!—into an expressive form so vivid and realistic that it could bring people to tears. Disney and his team had created the ultimate illusion: fictional characters created by hand, etched onto celluloid, and projected at twenty-four frames per second, that were somehow so believably human that it was almost impossible not to feel empathy for them.
Those weeping spectators at the Snow White premiere signaled a fundamental change in the relationship between human beings and the illusions concocted to amuse them. Complexity theorists have a term for this kind of change in physical systems: phase transitions. Alter one property of a system—lowering the temperature of a cloud of steam, for instance—and for a while the changes are linear: the steam gets steadily cooler. But then, at a certain threshold point, a fundamental shift happens: below 212 degrees Fahrenheit, the gas becomes liquid water. That moment marks the phase transition: not just cooler steam, but something altogether different.
…
It is possible—maybe even likely—that a further twist awaits us. When Charles Babbage encountered an automaton of a ballerina as a child in the early 1800s, the “irresistible eyes” of the mechanism convinced him that there was something lifelike in the machine. Those robotic facial expressions would seem laughable to a modern viewer, but animatronics has made a great deal of progress since then. There may well be a comparable threshold in simulated emotion—via robotics or digital animation, or even the text chat of an AI like LaMDA—that makes it near impossible for humans not to form emotional bonds with a simulated being. We knew the dwarfs in Snow White were not real, but we couldn’t keep ourselves from weeping for their lost princess in sympathy with them. Imagine a world populated by machines or digital simulations that fill our lives with comparable illusion, only this time the virtual beings are not following a storyboard sketched out in Disney’s studios, but instead responding to the twists and turns and unmet emotional needs of our own lives. (The brilliant Spike Jonze film Her imagined this scenario using only a voice.) There is likely to be the equivalent of a Turing Test for artificial emotional intelligence: a machine real enough to elicit an emotional attachment. It may well be that the first simulated intelligence to trigger that connection will be some kind of voice-only assistant, a descendant of software like Alexa or Siri—only these assistants will have such fluid conversational skills and growing knowledge of our own individual needs and habits that we will find ourselves compelled to think of them as more than machines, just as we were compelled to think of those first movie stars as more than just flickering lights on a fabric screen. Once we pass that threshold, a bizarre new world may open up, a world where our lives are accompanied by simulated friends…
Are we in for a phase-shift in our understanding of companionship? “Natural Magic,” from @stevenbjohnson, adapted from his book Wonderland: How Play Made The Modern World.
And for a different but apposite perspective, from the ever-illuminating L. M. Sacasas (@LMSacasas), see “LaMDA, Lemoine, and the Allures of Digital Re-enchantment.”
* Shakespeare, The Tempest
###
As we rethink relationships, we might recall that it was on this date in 2007 that the original iPhone was released. Generally downplayed by traditional technology pundits after its announcement six months earlier, the iPhone was greeted by long lines of buyers around the country on that first day. Quickly becoming a phenomenon, the iPhone sold one million units in only 74 days. Since those early days, successive iPhone models have continued to set sales records and have radically changed not only the smartphone and technology industries, but the world in which they operate as well.
“Create more value than you capture”*…
A thoughtful consideration of Web 3.0 from the always-insightful Tim O’Reilly…
There’s been a lot of talk about Web3 lately, and as the person who defined “Web 2.0” 17 years ago, I’m often asked to comment. I’ve generally avoided doing so because most prognostications about the future turn out to be wrong. What we can do, though, is to ask ourselves questions that help us see more deeply into the present, the soil in which the future is rooted. As William Gibson famously said, “The future is already here. It’s just not evenly distributed yet.” We can also look at economic and social patterns and cycles, using as a lens the observation ascribed to Mark Twain that “history doesn’t repeat itself, but it rhymes.”
Using those filters, what can we say about Web3?…
There follows a fascinating– and educational– analysis of the state of play and the issues that we face.
Tim concludes…
Let’s focus on the parts of the Web3 vision that aren’t about easy riches, on solving hard problems in trust, identity, and decentralized finance. And above all, let’s focus on the interface between crypto and the real world that people live in, where, as Matthew Yglesias put it when talking about housing inequality, “a society becomes wealthy over time by accumulating a stock of long-lasting capital goods.” If, as Sal Delle Palme argues, Web3 heralds the birth of a new economic system, let’s make it one that increases true wealth—not just paper wealth for those lucky enough to get in early but actual life-changing goods and services that make life better for everyone.
“Why it’s too early to get excited about Web3,” from @timoreilly.
See also: “My first impressions of web3” from Matthew Rosenfeld (AKA Moxie Marlinspike, @moxie, founder of @signalapp).
* Tim O’Reilly
###
As we focus on first principles, we might recall that it was on this date in 2007 that Steve Jobs introduced the iPhone at Macworld. The phone wasn’t available for sale until June 29th, occasioning one of the most heavily anticipated sales launches in the history of technology. Apple sold 1.4 million iPhones in 2007, with sales steadily increasing each year; estimated sales in 2021 are 240-250 million.
“The Encyclopedia – the advance artillery of reason, the armada of philosophy, the siege engine of the Enlightenment”…

Encyclopædia Britannica occupies a special place in the annals of publishing and the history of the West. Although its full influence, like that of any great work of literature, is ultimately immeasurable in concrete terms (the number of units sold is never the best barometer), its larger social and cultural impact—as a reference work, a spark to learning, a symbol of aspiration, a recorder of evolving knowledge, and a mirror of our changing times—has been extraordinary…
From George Bernard Shaw to Keith Richards, a few of Encyclopædia Britannica’s famous readers– and their fascinating tales– on the occasion of its 250th anniversary: “Encyclopedia Hounds.”
###
As we look it up, we might recall that it was on this date in 2007 that Apple released the first iPhone… the device that ushered in the smartphone era and that, with Wikipedia (which dates from 2001), contributed to the decline of Encyclopædia Britannica, which ceased print publication in 2012.
Are you sending a text, or are you just glad to see me?…
From Clusterflock, via the ever-illuminating Jason Kottke, “Meat Stylus for the iPhone”:

Sales of CJ Corporation’s snack sausages are on the increase in South Korea because of the cold weather; they are useful as a meat stylus for those who don’t want to take off their gloves to use their iPhones.
It seems that the sausages, electrostatically speaking, are close approximations of the human finger. Here’s the not-entirely-useful English translation of a Korean news article about the soaring sausage sales.
As we head directly for the refrigerated section of our grocery stores, we might recall that it was on this date in 1733 that James Oglethorpe founded the 13th of the original American Colonies– Georgia– and a settlement that has grown to become Savannah. February 12 is still observed as Georgia Day.
Oglethorpe’s idea was that British debtors should be released from prison and sent to the new colony. Ultimately, though, few debtors ended up in Georgia. Rather, colonists included many Scots and English tradesmen and artisans and religious refugees from Switzerland, France and Germany, as well as a number of Jewish refugees. The colony’s charter guaranteed the acceptance of all religions– except Roman Catholicism, a ban based on fears born of the colony’s proximity to the hostile settlements in Spanish Florida.
Oglethorpe also arranged that slavery should be banned by Georgia’s Royal Charter; and the colony was slavery-free through 1750 (after Oglethorpe’s departure back to England). At that point, the Crown acceded to land owners’ desire for a larger work force, and lifted the ban.