(Roughly) Daily

Posts Tagged ‘apple’

“There are two times in a man’s life when he shouldn’t speculate: when he can afford to and when he can’t.”*…

Robinhood, the trading platform supposedly meant “to democratize finance for all”: not all change is progress; not all “disruption” is for the good…

… What is Robinhood?

The company operates a mobile app that enables consumers to trade stocks, options, and crypto. These orders are the company’s inventory, which it sells to “market makers” — large financial institutions that execute the trades in the market. As with Google or Facebook, Robinhood’s users are not its customers, but its supply.

This means Robinhood is incentivized to keep its users trading … a lot. The goal: make stock trading as addictive as social media scrolling. RH has enjoyed success here. The proportion of users who check it daily rivals those of Twitter, Snapchat, and Facebook.

The transaction at the heart of the company’s model is “Payment for Order Flow” or PFOF. Because RH generates its revenue by selling orders to market makers, it doesn’t charge commissions to its consumer users. But this also creates a conflict of interest for the company, which is motivated to sell orders to the market maker that offers the highest payment for the trade rather than the best price. It’s like affiliate marketing, but for your financial future.
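To make the conflict concrete, here is a toy sketch of why routing by rebate rather than by price costs the customer. All numbers are hypothetical, and this deliberately simplifies how order routing actually works:

```python
# Toy illustration of the PFOF conflict of interest. All numbers are
# hypothetical; real order routing and rebates are far more complex.

market_makers = [
    # (name, execution price per share on a buy order, rebate paid to broker per share)
    ("Maker A", 100.00, 0.0020),  # best price for the customer
    ("Maker B", 100.02, 0.0035),  # worse price, bigger payment to the broker
]

shares = 1_000

best_for_customer = min(market_makers, key=lambda m: m[1])  # lowest buy price
best_for_broker = max(market_makers, key=lambda m: m[2])    # highest rebate

extra_cost = (best_for_broker[1] - best_for_customer[1]) * shares
broker_gain = (best_for_broker[2] - best_for_customer[2]) * shares

print(f"Routing by rebate sends the order to {best_for_broker[0]}:")
print(f"  broker earns an extra ${broker_gain:.2f},")
print(f"  customer pays an extra ${extra_cost:.2f} on {shares:,} shares.")
```

Small per-share differences, multiplied across millions of orders, are the whole business.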

PFOF goes back to the 1980s, when it was pioneered by, wait for it … Bernie Madoff. Madoff relied on the practice to make his firm one of the leading market makers of its day, and when regulators raised questions about whether it presented a conflict of interest, he used his position as the chairperson of Nasdaq to prevent restrictions. (PFOF is illegal in the U.K.) There was no conflict of interest, Madoff assured his colleagues, because “there are very strict rules that I would assume most firms comply with.”

Robinhood is the latest example of an increasing trend: tech companies for whom illegality is a feature, not a bug. Uber is an $86 billion gypsy cab company. Facebook and Google have received so many fines, it’s likely the companies internally classify them as a cost of doing business. This is tantamount to replacing civics courses with prison training, because … well … that’s how we roll.

For its part, RH has racked up: a $70 million settlement with FINRA, a $65 million SEC fine (for failing to properly disclose PFOF), and a separate $1.25 million FINRA fine. And on Wednesday, on the eve of pricing its IPO, the company disclosed that its senior executives are under investigation by FINRA for failing to acquire broker-dealer licenses. In addition, another inquiry is under way into the possibility that RH employees made illegal insider trades during the GameStop frenzy early this year.

Once, that type of disclosure would have dismembered an IPO. Instead, 48 hours after it made the disclosure, Robinhood was publicly trading at $32 billion. Telling point: The company paid its chief legal officer, Daniel Gallagher, more than $30 million in 2020, even though it hired him halfway through the year. From 2011 to 2015, Gallagher was an SEC Commissioner. Our business environment has morphed from capitalism, which depends on the rules of fair play, into cronyism.

Flouting the law is now a signal to investors that a firm is “disruptive.” Established companies, which believe they have too much to lose, have spent years investing in a culture of compliance to protect themselves. Disrupters, with access to cheap capital and few legacy assets, have no such constraints. In Robinhood’s case, no less an establishment bulwark than Goldman Sachs has blessed its approach to business by taking the lead on the company’s IPO. Forget orange — criminality without consequence is the new black.

In practice, Robinhood’s activities look more like the dispersion of financial risk than the “democratization of finance” — kind of like if a for-profit prison claimed to be “democratizing housing.” As both an app and an investment, RH makes more sense in the context of gambling than investing. Its business model depends on active traders, but research shows the more active traders are, the more money they lose. Likewise, the casino isn’t making much off the blackjack player who sits at the $5 table cadging free drinks, but it hopes the lure of easy money (and the lubrication of those free drinks) will loosen his pockets eventually.

Greater gambling access is becoming a trend. The illegal sports betting market, estimated at $150 billion a year, is rapidly moving to legal online forums. You can now place a sports bet from your couch in 20 states and counting, and mobile gambling apps are reaping the rewards. Since its SPAC listing in April 2020, DraftKings’ stock is up 160%. I don’t have a problem with this, as these firms are upfront about what they offer: gambling.

Another market that’s benefited from our insatiable appetite for risk? Crypto. Robinhood caught that trend early and introduced crypto trading to its platform in February 2018. Since then, the global crypto market has grown from $450 billion to $1.9 trillion. In the first three months of 2021, 6% of RH’s revenue came from Dogecoin trades. If that sounds like an unstable business model, trust your instincts.

Here’s what we’re saddled with: a wave of companies that prey on our financial naiveté, with no regard for law or morality, and with access to seemingly infinite capital. What can we do?

First, it’s long past time for the rule of law to reassert itself. Five years ago, admissions to elite universities were awash in bribery and fraud. Then the feds put some wealthy lawyers, investors, and television stars in jail. Did it work? I’d venture that if any parent receives an offer of a “side door” for their kid to get into an elite university today, the parent hangs up, crisply.

Second, we need to arm ourselves, and particularly our young people, with financial literacy. Everyone should be fluent in the basics of markets and how to build financial security. My NYU colleague Aswath Damodaran believes the best regulation is life lessons. Perhaps basic instruction in finance (e.g., don’t trade on an app that harvests your orders for revenue) would lessen the pain of those lessons. If we can offer computer science and Mandarin in schools, we should offer courses in financial literacy. The English-as-a-second-language course in any capitalist society ought to be in money.

We’ve implemented policies in the U.S. that have resulted in a halving of the wealth of Americans under the age of 40 (as a percentage of household wealth) over the past three decades. With so much less to lose, today’s young Americans are justifiably looking for new asset classes and embracing volatility. Put another way, there is cause for a rebellion. The food industrial complex wants you to be fat, social media wants you to be divided, and RH wants you to believe you can get rich quick by day trading. Rebel.

When the democratization of finance isn’t: “$HOOD.” Scott Galloway (@profgalloway) on the dangerously disingenuous Robinhood.

* Mark Twain

###

As we reconcile ourselves to the fact that if it seems too good to be true, it is, we might recall that it was on this date in 2018 that Apple became the first U.S.-based company with a $1 trillion market cap. Shares of Apple rose 2.9% on the day, closing at $207.39, giving the company a $1.002 trillion valuation. Apple’s stock was up about 40,000% since the company’s IPO on December 12th, 1980.
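As a back-of-the-envelope check (not an official figure), those two numbers imply roughly how many shares were outstanding at the time:

```python
# Implied shares outstanding from the reported close and market cap.
# A rough sanity check only; actual counts shift with buybacks and grants.
market_cap = 1.002e12  # dollars
close_price = 207.39   # dollars per share
print(f"{market_cap / close_price:,.0f} shares")  # ~4.83 billion
```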

Amazon broke the $1 trillion milestone a month later on September 4th, 2018. Microsoft reached the milestone nearly a year later, on April 25th, 2019.

source

Written by (Roughly) Daily

August 2, 2021 at 1:00 am

“Patents need inventors more than inventors need patents”*…

 

[image: a toilet-paper patent drawing]

 

Patents for invention — temporary monopolies on the use of new technologies — are frequently cited as a key contributor to the British Industrial Revolution. But where did they come from? We typically talk about them as formal institutions, imposed from above by supposedly wise rulers. But their origins, or at least their introduction to England, tell a very different story…

How the 15th century city guilds of Italy paved the way for the creation of patents and intellectual property as we know it: “Age of Invention: The Origin of Patents.”

(Image above: source)

* Kalyan C. Kankanala, Fun IP, Fundamentals of Intellectual Property

###

As we ruminate on rights, we might recall that it was on this date in 1981 that IBM introduced the IBM Personal Computer, commonly known as the IBM PC, the original version of the IBM PC compatible computer design… a relevant descriptor, as the IBM PC was based on open architecture, and third-party suppliers soon developed to provide peripheral devices, expansion cards, software, and ultimately, IBM compatible computers.  While IBM has gone out of the PC business, it had a substantial influence on the market in standardizing a design for personal computers; “IBM compatible” became an important criterion for sales growth.  Only Apple has been able to develop a significant share of the microcomputer market without compatibility with the IBM architecture (and what it has become).

[image: a youth computer class with an IBM PC (Bundesarchiv)] source

 

“The future is there… looking back at us. Trying to make sense of the fiction we will have become”*…

 

[image: Octavia Butler]

 

Tim Maughan, an accomplished science fiction writer himself, considers sci-fi works from the 1980s and 90s, and their predictive power.  Covering Bruce Sterling, William Gibson, Rudy Rucker, Stephen King, P.D. James, an episode of Star Trek: Deep Space Nine, and Blade Runner, he reserves special attention for a most deserving subject…

When you imagine the future, what’s the first date that comes into your mind? 2050? 2070? The year that pops into your head is almost certainly related to how old you are — some point within our lifetimes yet distant enough to be mysterious, still just outside our grasp. For those of us growing up in the 1980s and ’90s — and for a large number of science fiction writers working in those decades — the 2020s felt like that future. A decade we would presumably live to see but also seemed sufficiently far away that it could be a world full of new technologies, social movements, or political changes. A dystopia or a utopia; a world both alien and familiar.

That future is, of course, now…

Two science fiction books set in the 2020s tower over everything else from that era in their terrifying prescience: Octavia Butler’s Parable of the Sower (1993) and Parable of the Talents (1998). These books by the late master kick off in 2024 Los Angeles and are set against a backdrop of a California that’s been ravaged by floods, storms, and droughts brought on by climate change. Middle- and working-class families huddle together in gated communities, attempting to escape the outside world through addictive pharmaceuticals and virtual reality headsets. New religions and conspiracy theory–chasing cults begin to emerge. A caravan of refugees heads north to escape the ecological and social collapse, while a far-right extremist president backed by evangelical Christians comes to power using the chillingly familiar election slogan Make America Great Again.

Although it now feels like much of Butler’s Parable books might have been pulled straight from this afternoon’s Twitter or tonight’s evening news, some elements are more far-fetched. The second book ends with followers of the new religion founded by the central character leaving Earth in a spaceship to colonize Alpha Centauri. Butler originally planned to write a third book following the fates of these interstellar explorers but, sadly, passed away in 2006 before she had the chance. She left us with a duology that remains grounded, and scarily familiar, to those of us struggling to come to terms with the everyday dystopias that the real 2020s are already presenting.

Not that this remarkable accuracy was ever her objective.

“This was not a book about prophecy; this was an if-this-goes-on story,” Butler said about the books during a talk at MIT in 1998. “This was a cautionary tale, although people have told me it was prophecy. All I have to say to that is I certainly hope not.”

In the same talk, Butler describes in detail the fears that drove her to write this warning: the debate over climate change, the eroding of workers’ rights, the rise of the private prison industry, and the media’s increasing refusal to talk about all of these in favor of focusing on soundbite propaganda and celebrity news. Again, these are fears that feel instantly familiar today…

What Blade Runner, cyberpunk, and Octavia Butler had to say about the age we’re entering now: “How Science Fiction Imagined the 2020s.”

* William Gibson, Pattern Recognition

###

As we honor prophets, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time.  Two days later, the first Apple Macintosh went on sale.

 

Written by (Roughly) Daily

January 22, 2020 at 1:01 am

“When you have seen one ant, one bird, one tree, you have not seen them all”*…

 

[image: a photograph from Hugo Livet’s “Fireworks” series]

 

In fact, you may not have seen them– really seen them– at all…

Photographs of trees whose trunk and all visible branches have been removed by computer. Only the explosive power of the leaves remains, like fireworks in broad daylight. Through the process of retouching images, I sought to extract by subtraction this explosiveness, this life force, which contributes to the majesty of the plant world but is sometimes veiled by our habits of perception...

[images: two more photographs from the “Fireworks” series]

 

Visual artist and photographer Hugo Livet on his series “Fireworks” (“Feu d’artifice”).  More at his site.

* E. O. Wilson

###

As we commune with Kilmer, we might recall that it was on this date in 1307 that Wilhelm Tell (or, as we Anglos tend to know him, William Tell) shot an apple off his son’s head.

Tell, originally from Bürglen, was a resident of the Canton of Uri (in what is now Switzerland), well known as an expert marksman with the crossbow. At the time, the Habsburg emperors of Austria were seeking to dominate Uri.  Hermann Gessler, the newly appointed Austrian Vogt (a Holy Roman Empire title roughly equivalent to “bailiff” or “overlord”) of Altdorf, raised a pole in the village’s central square, hung his hat on top of it, and demanded that all the local townsfolk bow before the hat.  When Tell passed by the hat without bowing, he was arrested; his punishment was being forced to shoot an apple off the head of his son, Walter– or else both would be executed.  Tell was promised freedom if he succeeded.

As the legend has it, Tell split the fruit with a single bolt from his crossbow.  When Gessler queried him about the purpose of a second bolt in his quiver, Tell answered that if he had killed his son, he would have turned the crossbow on Gessler himself.  Gessler became enraged at that comment, and had Tell bound and brought to his ship to be taken to his castle at Küssnacht.  But when a storm broke on Lake Lucerne, Tell managed to escape.  On land, he went to Küssnacht, and when Gessler arrived, Tell shot him with his crossbow.

Tell’s defiance of Gessler sparked a rebellion, in which Tell himself played a major part, leading to the formation of the Swiss Confederation.

[image: Tell and his son]

 

 

Written by (Roughly) Daily

November 18, 2019 at 1:01 am

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

 

[image: a robot writer]

 

Recently, OpenAI announced its latest breakthrough, GPT-2, a language model that can write essays to a prompt, answer questions, and summarize longer works… and it does all of this well enough that OpenAI has declared the full trained model too dangerous to release (lest it enable “deepfake news” or other misleading mischief).
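For readers who want to poke at a model of this family themselves, here is a minimal sketch of sampling from GPT-2 — not OpenAI’s own interface, but assuming the publicly released weights as packaged by the Hugging Face transformers library:

```python
# A minimal sketch of sampling from GPT-2, assuming the Hugging Face
# `transformers` packaging of the publicly released weights.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The greatest danger of artificial intelligence is",
                   return_tensors="pt")

# Autoregressive generation: the model repeatedly predicts the next token
# given everything before it -- no task-specific training involved.
output = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Each run produces a different continuation, which is exactly the property that makes such models both charming and worrying.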

Scott Alexander contemplates the results.  His conclusion:

a brain running at 5% capacity is about as good as the best AI that the brightest geniuses working in the best-equipped laboratories in the greatest country in the world are able to produce in 2019. But:

We believe this project is the first step in the direction of developing large NLP systems without task-specific training data. That is, we are developing a machine language system in the generative style with no explicit rules for producing text. We hope for future collaborations between computer scientists, linguists, and machine learning researchers.

A boring sentiment from an interesting source: the AI wrote that when asked to describe itself. We live in interesting times.

His complete post, eminently worthy of reading in full: “Do Neural Nets Dream of Electric Hobbits?”

[image above, and another account of OpenAI’s creation: “OpenAI says its new robo-writer is too dangerous for public release”]

* Eliezer Yudkowsky

###

As we take the Turing Test, we might send elegantly-designed birthday greetings to Steve Jobs; he was born on this date in 1955.  While he is surely well-known to every reader here, let us note for the record that he was instrumental in developing the Macintosh, the computer that took Apple to unprecedented levels of success.  After leaving the company he started with Steve Wozniak, Jobs continued his personal computer development at NeXT Inc.  In 1997, Jobs returned to Apple to lead the company into a new era based on NeXT technologies and consumer electronics.  Some of Jobs’ achievements in this new era include the iMac, the iPhone, the iTunes music store, the iPod, and the iPad.  Under Jobs’ leadership, Apple was at one time the world’s most valuable company.  (And, of course, he bought Pixar from George Lucas and oversaw both its rise to animation dominance and its sale to Disney, as a product of which Jobs became Disney’s largest single shareholder.)

[image: Steve Jobs] source

 

Written by (Roughly) Daily

February 24, 2019 at 1:01 am
