“Other things stop working or they break. But batteries… they die”*…
There’s been a great deal of talk about semiconductors and the implications of a supply chain that depends too heavily on China– and some action (cf., e.g., here and here). Fair enough: chips are clearly central to the economy into which we’re growing; assuring access matters. But let us not forget their humble technological cousin, the battery. Batteries power ever more of the appliances on which our lives increasingly depend; with the world gearing up for the electric vehicle era, we’re going to need them even more. So it could be an issue that the world is much more reliant on China for batteries than for chips…
Battery manufacturing has become a priority for many nations, including the United States. However, having entered the race for batteries early, China is far and away in the lead… In 2022, China had more battery production capacity than the rest of the world combined…
Global lithium-ion manufacturing capacity is projected to increase eightfold in the next five years… China’s well-established advantage is set to continue through 2027, with 69% of the world’s battery manufacturing capacity…
Battery manufacturing is just one piece of the puzzle, albeit a major one. Most of the parts and metals that make up a battery—like battery-grade lithium, electrolytes, separators, cathodes, and anodes—are primarily made in China.
Therefore, combating China’s dominance will be expensive. According to Bloomberg, the U.S. and Europe will have to invest $87 billion and $102 billion, respectively, to meet domestic battery demand with fully local supply chains by 2030…
More (and a larger version of the graphic above) at “Visualizing China’s Dominance in Battery Manufacturing (2022-2027P),” from @VisualCap.
* Demetri Martin
###
As we recharge, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time. Two days later, the first Apple Macintosh went on sale… battery-dependent portables followed a few years later.
“Create more value than you capture”*…
A thoughtful consideration of Web3 from the always-insightful Tim O’Reilly…
There’s been a lot of talk about Web3 lately, and as the person who defined “Web 2.0” 17 years ago, I’m often asked to comment. I’ve generally avoided doing so because most prognostications about the future turn out to be wrong. What we can do, though, is to ask ourselves questions that help us see more deeply into the present, the soil in which the future is rooted. As William Gibson famously said, “The future is already here. It’s just not evenly distributed yet.” We can also look at economic and social patterns and cycles, using as a lens the observation ascribed to Mark Twain that “history doesn’t repeat itself, but it rhymes.”
Using those filters, what can we say about Web3?…
There follows a fascinating– and educational– analysis of the state of play and the issues that we face.
Tim concludes…
Let’s focus on the parts of the Web3 vision that aren’t about easy riches, on solving hard problems in trust, identity, and decentralized finance. And above all, let’s focus on the interface between crypto and the real world that people live in, where, as Matthew Yglesias put it when talking about housing inequality, “a society becomes wealthy over time by accumulating a stock of long-lasting capital goods.” If, as Sal Delle Palme argues, Web3 heralds the birth of a new economic system, let’s make it one that increases true wealth—not just paper wealth for those lucky enough to get in early but actual life-changing goods and services that make life better for everyone.
“Why it’s too early to get excited about Web3,” from @timoreilly.
See also: “My first impressions of web3” from Matthew Rosenfeld (AKA Moxie Marlinspike, @moxie, founder of @signalapp).
* Tim O’Reilly
###
As we focus on first principles, we might recall that it was on this date in 2007 that Steve Jobs introduced the iPhone at Macworld. The phone wasn’t available for sale until June 29th, occasioning one of the most heavily anticipated sales launches in the history of technology. Apple sold 1.4 million iPhones in 2007, a number that grew steadily each year; estimated sales in 2021 are 240-250 million.
“Patents need inventors more than inventors need patents”*…

Patents for invention — temporary monopolies on the use of new technologies — are frequently cited as a key contributor to the British Industrial Revolution. But where did they come from? We typically talk about them as formal institutions, imposed from above by supposedly wise rulers. But their origins, or at least their introduction to England, tell a very different story…
How the 15th-century city guilds of Italy paved the way for the creation of patents and intellectual property as we know it: “Age of Invention: The Origin of Patents.”
* Fun IP, Fundamentals of Intellectual Property
###
As we ruminate on rights, we might recall that it was on this date in 1981 that IBM introduced the IBM Personal Computer, commonly known as the IBM PC, the original version of the IBM PC compatible computer design… a relevant descriptor, as the IBM PC was based on an open architecture, and third-party suppliers soon emerged to provide peripheral devices, expansion cards, software, and ultimately, IBM-compatible computers. While IBM has since exited the PC business, it had a substantial influence on the market in standardizing a design for personal computers; “IBM compatible” became an important criterion for sales growth. Only Apple has been able to develop a significant share of the microcomputer market without compatibility with the IBM architecture (and what it has become).
“The future is there… looking back at us. Trying to make sense of the fiction we will have become”*…

Tim Maughan, an accomplished science fiction writer himself, considers sci-fi works from the 1980s and ’90s, and their predictive power. Covering Bruce Sterling, William Gibson, Rudy Rucker, Stephen King, P.D. James, an episode of Star Trek: Deep Space Nine, and Blade Runner, he reserves special attention for a most deserving subject…
When you imagine the future, what’s the first date that comes into your mind? 2050? 2070? The year that pops into your head is almost certainly related to how old you are — some point within our lifetimes yet distant enough to be mysterious, still just outside our grasp. For those of us growing up in the 1980s and ’90s — and for a large number of science fiction writers working in those decades — the 2020s felt like that future. A decade we would presumably live to see but also seemed sufficiently far away that it could be a world full of new technologies, social movements, or political changes. A dystopia or a utopia; a world both alien and familiar.
That future is, of course, now…
Two science fiction books set in the 2020s tower over everything else from that era in their terrifying prescience: Octavia Butler’s Parable of the Sower (1993) and Parable of the Talents (1998). These books by the late master kick off in 2024 Los Angeles and are set against a backdrop of a California that’s been ravaged by floods, storms, and droughts brought on by climate change. Middle- and working-class families huddle together in gated communities, attempting to escape the outside world through addictive pharmaceuticals and virtual reality headsets. New religions and conspiracy theory–chasing cults begin to emerge. A caravan of refugees head north to escape the ecological and social collapse, while a far-right extremist president backed by evangelical Christians comes to power using the chillingly familiar election slogan Make America Great Again.
Although it now feels like much of Butler’s Parable books might have been pulled straight from this afternoon’s Twitter or tonight’s evening news, some elements are more far-fetched. The second book ends with followers of the new religion founded by the central character leaving Earth in a spaceship to colonize Alpha Centauri. Butler originally planned to write a third book following the fates of these interstellar explorers but, sadly, passed away in 2006 before she had a chance. She left us with a duology that remains more grounded and scarily familiar to those of us struggling to come to terms with the everyday dystopias that the real 2020s seem already to be presenting us with.
Not that this remarkable accuracy was ever her objective.
“This was not a book about prophecy; this was an if-this-goes-on story,” Butler said about the books during a talk at MIT in 1998. “This was a cautionary tale, although people have told me it was prophecy. All I have to say to that is I certainly hope not.”
In the same talk, Butler describes in detail the fears that drove her to write this warning: the debate over climate change, the eroding of workers’ rights, the rise of the private prison industry, and the media’s increasing refusal to talk about all of these in favor of focusing on soundbite propaganda and celebrity news. Again, these are fears that feel instantly familiar today…
What Blade Runner, cyberpunk– and Octavia Butler– had to say about the age we’re entering now: “How Science Fiction Imagined the 2020s.”
* William Gibson, Pattern Recognition
###
As we honor prophets, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time. Two days later, the first Apple Macintosh went on sale.