(Roughly) Daily

Posts Tagged ‘economics’

“People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are”*…

Ed Zitron argues that America has too many managers, and managers misbehaving at that…

In a 2016 Harvard Business Review analysis, two writers calculated the annual cost of excess corporate bureaucracy as about $3 trillion, with an average of one manager for every 4.7 workers. Their story mentioned several case studies—a successful GE plant with 300 technicians and a single supervisor, a Swedish bank with 12,000 workers and three levels of hierarchy—that showed that reducing the number of managers usually led to more productivity and profit. And yet, at the time of the story, 17.6 percent of the U.S. workforce (and 30 percent of the workforce’s compensation) was made up of managers and administrators—an alarming statistic that showed how bloated America’s management ranks had become.

The United States, more than anywhere else in the world, is addicted to the concept of management. As I’ve written before, management has become a title rather than a discipline. We have a glut of people in management who were never evaluated on their ability to manage before being promoted to their role. We have built corporate America around the idea that if you work hard enough, one day you might become a manager, someone who makes rather than takes orders. While this is not the only form of management, based on the response to my previous article and my newsletters on the subject, this appears to be how many white-collar employees feel. Across disparate industries, an overwhelming portion of management personnel is focused more on taking credit and placing blame rather than actually managing people, with dire consequences.

This type of “hall monitor” management, as a practice, is extremely difficult to execute remotely, and thus the coming shift toward permanent all- or part-remote work will lead to a dramatic rethinking of corporate structure. Many office workers—particularly those in industries that rely on the skill or creativity of day-to-day employees—are entering a new world where bureaucracy will be reduced not because executives have magically become empathetic during the pandemic, but because slowing down progress is bad business. In my eyes, that looks like a world in which the power dynamics of the office are inverted. With large swaths of people working from home some or all of the time, managers will be assessed not on their ability to intimidate other people into doing things, but on their ability to provide their workers with the tools they need to measurably succeed at their job.

In order to survive, managers, in other words, will need to start proving that they actually do something. What makes this shift all the more complicated is that many 21st-century, white-collar employees don’t necessarily need a hands-on manager to make sure they get their work done…

The pandemic has laid bare that corporate America disrespects entry-level workers. At many large companies, the early years of your career are a proving ground with little mentorship and training. Too many companies hand out enormous sums to poach people trained elsewhere, while ignoring the way that the best sports teams tend to develop stars—by taking young, energetic people and investing in their future (“trust the process,” etc.). This goes beyond investing in education and courses; it involves taking rising stars in your profession and working to make them as good as your top performer.

In a mostly remote world, a strong manager is someone who gets the best out of the people they’re managing, and sees the forest from the trees—directing workers in a way that’s informed by both experience and respect. Unfortunately, the traditional worker-to-manager pipeline often sets people up for inefficiency and failure. It’s the equivalent of taking a pitcher in their prime and making them a coach—being good at one thing doesn’t mean you can make other people good at the same thing. This is known as the Peter principle, a management concept developed by Laurence J. Peter in the late ’60s that posits that a person who’s good at their job in a hierarchical organization will invariably be promoted to a position that requires different skills, until they’re eventually promoted to something they can’t do, at which point they’ve reached their “maximum incompetence.” Consistent evidence shows that the principle is real: A study of sales workers at 214 firms by the National Bureau of Economic Research found that firms prioritize current job performance in promotion decisions over whether the person can actually do the job for which they’re being considered. In doing so, they’re placing higher value on offering the incentive of promotion to get more out of their workers, at the cost of potentially injecting bad management into their organization.

What I’m talking about here is a fundamental shift in how we view talent in the workplace. Usually, when someone is good at their job, they are given a soft remit to mentor people, but rarely is that formalized into something that is mutually beneficial. A lack of focus on fostering talent is counterintuitive, and likely based on a level of fear that one could train one’s own replacement, or that a business could foster its own competition. This is a problem that could be solved by paying people more money for being better at their job. Growing talent is also a more sustainable form of business—one that harkens back to the days of apprenticeships—where you’re fostering and locking up talent so that it doesn’t go elsewhere, and doesn’t cost you time and money to have to recruit it (or onboard it, which costs, on average, more than $4,000 a person). Philosophically, it changes organizations from a defensive position (having to recruit to keep up) to an offensive position (building an organization from within), and also greatly expands an organization’s ability to scale affordably…

The problem is that modern American capitalism has equated “getting the most out of someone” with “getting the most hours out of them,” rather than getting the most value out of them. “Success,” as I’ve discussed before, is worryingly disconnected from actually succeeding in business.

Reducing bureaucracy is also a net positive for the labor market, especially for young people. Entry-level corporate work is extremely competitive and painful, a years-long process in which you’re finding your footing in an industry and an organization. If we can change the lens through which we view those new to the workforce—as the potential hotshots of the future, rather than people who have to prove themselves—we’ll have stronger organizations that waste less money. We should be trying to distill and export the talents of our best performers, and give them what they need to keep doing great things for our companies while also making their colleagues better too.

All of this seems inevitable, to me, because a remote future naturally reconfigures the scaffolding of how work is done and how workers are organized. The internet makes the world a much smaller place, which means that simple things such as keeping people on task don’t justify an entire position—but mentorship and coaching that can get the best out of each worker do.

Hopefully we can move beyond management as a means of control, and toward a culture that appreciates a manager who fosters and grows the greatness in others.

The pandemic has exposed a fundamental weakness in the system: “Say Goodbye to Your Manager,” from @edzitron.

* Peter Drucker

###

As we reorganize, we might recall that it was on this date that Henri Giffard made the first powered and controlled flight of an airship, traveling 27 km from Paris to Élancourt in his “Giffard dirigible.”

Airships were the first aircraft capable of controlled powered flight, and were most commonly used before the 1940s, largely floated with (highly flammable) hydrogen gas. Their use decreased as their capabilities were surpassed by those of airplanes – and then plummeted after a series of high-profile accidents, including the 1930 crash and burning of the British R101 in France; the 1933 and 1935 storm-related crashes of the U.S. Navy’s twin helium-filled rigid airships (and airborne aircraft carriers), the USS Akron and the USS Macon; and – most famously – the 1937 burning of the German hydrogen-filled Hindenburg.

The Giffard dirigible [source]

Written by (Roughly) Daily

September 24, 2021 at 1:00 am

“The old world is dying, and the new world struggles to be born; now is the time of monsters”*…

Historian Adam Tooze‘s new book, Shutdown: How Covid Shook the World’s Economy, is released today. You can (and, I’d suggest, should) read excerpts from its introduction in The Guardian and The New York Times.

In his newsletter, he unpacks the fundamental historiographic challenge that he encountered in writing it, why that challenge matters… and why we must all face (and face up to) it:

I generally prefer a narrative mode that plunges you into the middle of things, rather than beginning at the beginning. The in medias res approach is more engaging. It catches the reader’s attention from the start because they have to scramble to orientate themselves. It is also more transparent in its artifice. I prefer the deliberate and obvious break in the linear flow produced by a flashback – “now we interrupt the action to explain something you really need to know” – to the apparent simplicity and calm of “beginning at the beginning”, which in its own way begs all the same questions, but smuggles the answers into the smooth flow of a linear narrative.

As [critic Perry] Anderson suggested [here], this stylistic preference also reflects a certain understanding of politics and agency and their relationship to history, which might broadly be described as Keynesian left-liberalism. As he puts it, “a ‘situational and tactical’ approach to the subject in hand determines entry to” the subject matter “in medias res”. It mirrors my preoccupation with “pragmatic crisis management in the form of punctual adjustments without illusion of permanency”.

I side with those who see “in medias res”, not just as a stylistic choice and a mode of historical and political analysis, but as defining the human condition – apologies for the boldness of that claim. Being thrown into pre-given situations defines us, whether through social structure, language, concepts, identities or chains of action and interaction, in which we are willy-nilly enrolled and to which we ourselves contribute, thereby enrolling others as well.

Whatever thinking or writing we do, however we choose to couch it and whatever our explanatory ambition, we do it from the midst of things, not from above or beyond the fray. There are different ways of articulating that relationship – more remote or more immediate – but no way out of that situatedness.

We are thrown into situations. Most of the time they don’t come with instructions. If they do come with instructions we should probably not trust them. We have to perform enquiries to figure out how we got here, what our options are and where we might be headed. To do the work of figuring out our situation we might resort to the tools of social science, like statistics or economic concepts. Political theory may help. But history writing too is part of the effort at rendering our situations more intelligible.

For some colleagues, history is distinctive because it studies the distant past, or because it takes the archive as its source. For me, self-consciously inhabiting our situatedness in time is what differentiates historical enquiry and writing from other forms of social knowledge. History is the attempt to produce knowledge of the flux from within the flux. As Croce remarks: “All true history is contemporary history.”

The speed, intensity and generality of the COVID pandemic, and the cognitive challenges it posed, gave this entanglement a new intensity. Even at the best of times, however, the problem is that being in medias res is easier said than done. It is both inescapable and, at the same time, mysterious.

We are in medias res you say? In the middle of things? But which things? And how do those things relate to us and define us? Who or what are we in relation to these things? How do we chart the middle of this world? Who has the map? Who has the compass?…

@adam_tooze goes on to propose, if not concrete answers to those questions, then an approach that can keep one honest. Eminently worth reading in full. History in the thick of it: “Writing in medias res.”

*  Antonio Gramsci, Prison Notebooks

###

As we ponder perspective, we might spare a thought for Alan John Percivale (A. J. P.) Taylor; he died on this date in 1990. A historian, he wrote (albeit not overtly in medias res) and taught, briefly at Manchester University and then for most of his career at Oxford, focusing largely on 19th- and 20th-century European diplomacy. But he gained a popular audience of millions via his journalism and broadcast lectures. His combination of academic rigor and popular appeal led the historian Richard Overy to describe him as “the Macaulay of our age.”

source

Written by (Roughly) Daily

September 7, 2021 at 1:00 am

“Symmetry is not the way of the world in all times and places”*…

What a difference a couple of decades make…

Asymmetries are back. Rising market power, the sudden ubiquity of global digital networks, hierarchical hub-and-spoke structures in international trade and finance, and the enduring dominance of the US dollar, despite the transition to floating exchange rates, all point to their resurgence. The remarkable decay of economic multilateralism in the very fields – trade and development finance – where global rules and institutions were first tried and reigned supreme for decades, is paving the way to a redefinition of international relations on a bilateral or regional basis, with powerful countries setting their own rules of the game. This transformation is compounded by the strengthening of geopolitical rivalry between the US, China and a handful of second-tier powers.

Donald Trump’s attempt to leverage US centrality in the global economy to extract rents from economic partners was short-lived. But US policy has certainly changed permanently. For all its friendly intentions, the Biden administration leaves no doubt about its overriding priorities: a foreign policy for the (domestic) middle class – to quote the title of a recent report (Ahmed et al., 2020) – and the preservation of the US edge over China. China, for its part, has set itself the goal of becoming by 2049 a “fully developed, rich and powerful” nation and does not show any intention to play by multilateral rules that were conceived by others. In this context, the rapid escalation of great power competition between Washington and Beijing is driving both rivals towards the building of competing systems of bilateral or regional arrangements.

What is emerging is not only an asymmetric hub-and-spoke landscape. It is a world in which hubs are controlled by major geopolitical powers – in other words, a multipolar, fragmented world. Nothing indicates that these asymmetries will fade away any time soon. On the contrary, economic, systemic and geopolitical factors all suggest they may prove persistent. We will have to learn to live with them.

There are several consequences. First, this new context calls for an analytical reassessment. Recent research has put the spotlight on a series of economic, financial or monetary asymmetries and has begun to uncover their determinants and effects. Analytical and empirical tools are available that make it possible to gather systematic evidence and to document the impact of asymmetries on the distribution of the gains from economic interdependence. We are on our way to learning more about the welfare and the policy implications of participating in an increasingly asymmetric global system.

Second, the relationship between economics and geopolitics must now be looked at in a more systematic way. For many years – even before the demise of the Soviet Union – international economic relations were considered in isolation, at least by economists. They were looked at as if they were (mostly) immune from geopolitical tensions. This stance is no longer tenable, at a time when great-power rivalry is reasserting itself as a key determinant of policy decisions. Whatever their wishes, economists have no choice but to respond to this new reality. They should document the potential for coercion by powers in control of crucial nodes or infrastructures and the risks involved in participating in the global economy from a vulnerable position.

Third, supporters of multilateralism need to wake up to the new context. They have too often championed a world made up of peaceful and balanced relations that bears limited resemblance to reality. Because power and asymmetry can only be forgotten at one’s own risk, neglecting them inevitably fuels mistrust of principles, rules and institutions that are perceived as biased. Multilateralism remains essential, but institutions are not immune to the risk of capture.

Asymmetry, however, does not imply a change of paradigm. Even if it affects the distribution of gains from trade, it does not abolish them. And in a world in which global public goods (and bads) have moved to the forefront of the policy agenda, there is no alternative to cooperation and institutionalised collective action. The prevention of climate-related disasters, maintenance of public health and preservation of biodiversity will remain vital tasks whatever the state of international relations. What asymmetries call for is an adaptation of the policy template. The multilateral project should not be ditched, but it must be rooted in reality.

Understanding the emerging new global economy: from the conclusion of Jean Pisani-Ferry‘s (@pisaniferry) paper, “Global Asymmetries Strike Back,” eminently worth reading in full. [Via @adam_tooze]

*  Charles Kindleberger, economic historian and architect of the Marshall Plan

###

As we find our place, we might send tight birthday greetings to Paul Adolph Volcker Jr.; he was born on this date in 1927. An economist, he was appointed Federal Reserve Chair by President Carter in 1979, and reappointed by President Reagan. He took that office in a time of “stagflation” in the U.S.; his tight money policies, combined with Reagan’s expansive fiscal policy (large tax cuts and a major increase in military spending), tamed inflation, but led to much larger federal deficits (and thus, higher federal interest costs) and increased economic imbalances across the economy. In the end, Reagan let Volcker go; as Joseph Stiglitz observed, “Paul Volcker… known for keeping inflation under control, was fired because the Reagan administration didn’t believe he was an adequate de-regulator.”

Volcker returned to government service in 2009 as the chairman of President Obama’s Economic Recovery Advisory Board. In 2010, Obama proposed bank regulations which he dubbed “The Volcker Rule,” which would prevent commercial banks from owning and investing in hedge funds and private equity, and limit the trading they do for their own accounts (a reprise of a key element in the then-defunct Glass–Steagall Act). It was enacted; but in 2020, FDIC officials said the agency would loosen the restrictions of the Volcker Rule, allowing banks to more easily make large investments into venture capital and similar funds.

source

“When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals”*…

It’s said that nothing lasts forever…

In 1930, the English economist John Maynard Keynes took a break from writing about the problems of the interwar economy and indulged in a bit of futurology. In an essay entitled “Economic Possibilities for Our Grandchildren,” he speculated that by the year 2030 capital investment and technological progress would have raised living standards as much as eightfold, creating a society so rich that people would work as little as fifteen hours a week, devoting the rest of their time to leisure and other “non-economic purposes.” As striving for greater affluence faded, he predicted, “the love of money as a possession . . . will be recognized for what it is, a somewhat disgusting morbidity.”
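[A quick aside: Keynes’s “eightfold” figure implies a surprisingly modest compound growth rate. A back-of-the-envelope sketch in Python – the arithmetic is ours, not Keynes’s or Cassidy’s – makes the implied rate explicit:]

```python
# Sanity check of Keynes's "eightfold" projection: what constant annual
# growth rate compounds to an 8x rise in living standards over the
# hundred years from 1930 to 2030?

horizon_years = 100      # 1930 -> 2030, the window Keynes used
target_multiple = 8      # "as much as eightfold"

implied_rate = target_multiple ** (1 / horizon_years) - 1
print(f"Implied annual growth: {implied_rate:.2%}")  # ~2.10%

# Equivalently: 8x is three doublings (2**3 = 8), and by the "rule of
# 70" a 2.1% growth rate doubles output about every 70 / 2.1 ≈ 33 years.
```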

This transformation hasn’t taken place yet, and most economic policymakers remain committed to maximizing the rate of economic growth. But Keynes’s predictions weren’t entirely off base. After a century in which G.D.P. per person has gone up more than sixfold in the United States, a vigorous debate has arisen about the feasibility and wisdom of creating and consuming ever more stuff, year after year. On the left, increasing alarm about climate change and other environmental threats has given birth to the “degrowth” movement, which calls on advanced countries to embrace zero or even negative G.D.P. growth. “The faster we produce and consume goods, the more we damage the environment,” Giorgos Kallis, an ecological economist at the Autonomous University of Barcelona, writes in his manifesto, “Degrowth.” “There is no way to both have your cake and eat it, here. If humanity is not to destroy the planet’s life support systems, the global economy should slow down.” In “Growth: From Microorganisms to Megacities,” Vaclav Smil, a Czech-Canadian environmental scientist, complains that economists haven’t grasped “the synergistic functioning of civilization and the biosphere,” yet they “maintain a monopoly on supplying their physically impossible narratives of continuing growth that guide decisions made by national governments and companies.”

Once confined to the margins, the ecological critique of economic growth has gained widespread attention. At a United Nations climate-change summit in September, the teen-age Swedish environmental activist Greta Thunberg declared, “We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you!” The degrowth movement has its own academic journals and conferences. Some of its adherents favor dismantling the entirety of global capitalism, not just the fossil-fuel industry. Others envisage “post-growth capitalism,” in which production for profit would continue, but the economy would be reorganized along very different lines. In the influential book “Prosperity Without Growth: Foundations for the Economy of Tomorrow,” Tim Jackson, a professor of sustainable development at the University of Surrey, in England, calls on Western countries to shift their economies from mass-market production to local services—such as nursing, teaching, and handicrafts—that could be less resource-intensive. Jackson doesn’t underestimate the scale of the changes, in social values as well as in production patterns, that such a transformation would entail, but he sounds an optimistic note: “People can flourish without endlessly accumulating more stuff. Another world is possible.”

Even within mainstream economics, the growth orthodoxy is being challenged, and not merely because of a heightened awareness of environmental perils. In “Good Economics for Hard Times,” two winners of the 2019 Nobel Prize in Economics, Abhijit Banerjee and Esther Duflo, point out that a larger G.D.P. doesn’t necessarily mean a rise in human well-being—especially if it isn’t distributed equitably—and the pursuit of it can sometimes be counterproductive. “Nothing in either our theory or the data proves the highest G.D.P. per capita is generally desirable,” Banerjee and Duflo, a husband-and-wife team who teach at M.I.T., write…

As the estimable John Cassidy (@JohnCassidy) explains, the critique of economic growth, once a fringe position, is gaining widespread attention in the face of the climate crisis: “Can We Have Prosperity Without Growth?”

See also Branko Milanovic (@BrankoMilan): “Degrowth: solving the impasse by magical thinking.”

* “When the accumulation of wealth is no longer of high social importance, there will be great changes in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession — as distinguished from the love of money as a means to the enjoyments and realities of life — will be recognized for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease.” — John Maynard Keynes, Economic Possibilities for Our Grandchildren

###

As we internalize externalities, we might send carefully-conserved birthday greetings to Gifford Pinchot; he was born on this date in 1865. An American forester, he became the first chief of the Forest Service in 1905. By 1910, with President Theodore Roosevelt’s backing, he had built 60 forest reserves covering 56 million acres into 150 national forests covering 172 million acres. Roosevelt’s successor, President Taft – no environmentalist – fired Pinchot. Still, Pinchot’s efforts earned him the honorific “the father of conservation.”

source

Written by (Roughly) Daily

August 11, 2021 at 1:00 am

“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

True randomness (it’s lumpy)

We tend dramatically to underestimate the role of randomness in the world…

Arkansas was one out away from the 2018 College World Series championship, leading Oregon State in the best-of-three series and up 3-2 in the ninth inning of the second game when Cadyn Grenier lofted a foul pop down the right-field line. Three Razorbacks converged on the ball and were in position to make a routine play on it, only to watch it fall untouched to the ground in the midst of them. Had any one of them made the play, Arkansas would have been the national champion.

Nobody did.

Given “another lifeline,” Grenier hit an RBI single to tie the game before Trevor Larnach launched a two-run homer to give the Beavers a 5-3 lead and, ultimately, the game. “As soon as you see the ball drop, you know you have another life,” Grenier said. “That’s a gift.” The Beavers accepted the gift eagerly and went on to win the championship the next day as Oregon State rode freshman pitcher Kevin Abel to a 5-0 win over Arkansas in the deciding game of the series. Abel threw a complete-game shutout and retired the last 20 hitters he faced.

The highly unlikely happens pretty much all the time…

We readily – routinely – underestimate the power and impact of randomness in and on our lives. In his book, The Drunkard’s Walk, Caltech physicist Leonard Mlodinow employs the idea of the “drunkard’s [random] walk” to compare “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” with “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” 

Although countless random interactions seem to cancel one another out within large data sets, sometimes, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction…a noticeable jiggle occurs.” When that happens, we notice the unlikely directional jiggle and build a carefully concocted story around it while ignoring the many, many random, counteracting collisions.

As Tversky and Kahneman have explained, “Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not ‘corrected’ as a chance process unfolds, they are merely diluted.”
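[To see “diluted, not corrected” concretely, here’s a minimal coin-flip simulation – an illustrative sketch of ours, not from Seawright’s essay: the fraction of heads converges to one half, while the absolute excess of heads over tails tends to drift wider, on the order of the square root of the number of flips.]

```python
import random

# A fair-coin "drunkard's walk": the *fraction* of heads converges to
# 0.5, but the *absolute* excess of heads over tails is diluted rather
# than corrected -- it typically grows on the order of sqrt(n).

random.seed(42)  # arbitrary seed, for reproducibility

heads = 0
checkpoints = {100, 1_000, 10_000, 100_000, 1_000_000}
for n in range(1, 1_000_001):
    heads += random.random() < 0.5
    if n in checkpoints:
        excess = 2 * heads - n  # heads minus tails
        print(f"n={n:>9,}  fraction of heads={heads / n:.4f}  excess={excess:+,}")

# Typical run: the fraction column settles toward 0.5000 while the
# excess column wanders farther from zero in absolute terms.
```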

As Stephen Jay Gould famously argued, were we able to recreate the experiment of life on Earth a million different times, nothing would ever be the same, because evolution relies upon randomness. Indeed, the essence of history is contingency.

Randomness rules.

Luck matters. A lot. Yet, we tend dramatically to underestimate the role of randomness in the world.

The self-serving bias is our tendency to see the good stuff that happens as our doing (“we worked really hard and executed the game plan well”) while the bad stuff isn’t our fault (“It just wasn’t our night” or “we simply couldn’t catch a break” or “we would have won if the umpiring hadn’t been so awful”). Thus, desirable results are typically due to our skill and hard work — not luck — while lousy results are outside of our control and the offspring of being unlucky.

Two fine books undermine this outlook by (rightly) attributing a surprising amount of what happens to us – both good and bad – to luck. Michael Mauboussin’s The Success Equation seeks to untangle elements of luck and skill in sports, investing, and business. Ed Smith’s Luck considers a number of fields – international finance, war, sports, and even his own marriage – to examine how random chance influences the world around us. For example, Mauboussin describes the “paradox of skill” as follows: “As skill improves, performance becomes more consistent, and therefore luck becomes more important.” In investing, therefore (and for example), as the population of skilled investors has increased, the variation in skill has narrowed, making luck increasingly important to outcomes.
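[Mauboussin’s “paradox of skill” can be made concrete with a toy model – ours, not his – in which each outcome is skill plus independent luck; shrink the spread of skill across competitors, and luck’s share of the variance in outcomes grows:]

```python
import random
import statistics

# Toy model of the "paradox of skill": outcome = skill + luck, both drawn
# independently. Hold the luck spread fixed and shrink the skill spread
# (as a field professionalizes): luck explains a growing share of the
# variance in outcomes.

random.seed(0)  # arbitrary seed, for reproducibility

def luck_share(skill_sd: float, luck_sd: float = 1.0, n: int = 100_000) -> float:
    """Estimated fraction of outcome variance attributable to luck."""
    outcomes = [random.gauss(0, skill_sd) + random.gauss(0, luck_sd)
                for _ in range(n)]
    return luck_sd ** 2 / statistics.variance(outcomes)

for skill_sd in (2.0, 1.0, 0.5, 0.25):
    print(f"skill spread {skill_sd:.2f} -> luck's share of outcome "
          f"variance ≈ {luck_share(skill_sd):.0%}")
# Expect roughly 20%, 50%, 80%, and 94% as the skill spread narrows.
```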

On account of the growth and development of the investment industry, John Bogle could quite consistently write his senior thesis at Princeton on the successes of active fund management and then go on to found Vanguard and become the primary developer and intellectual forefather of indexing. In other words, the ever-increasing aggregate skill (supplemented by massive computing power) of the investment world has come largely to cancel itself out.

After a big or revolutionary event, we tend to see it as having been inevitable. Such is the narrative fallacy. In this paper, ESSEC Business School’s Stoyan Sgourev notes that scholars of innovation typically focus upon the usual type of case, where incremental improvements rule the day. Sgourev moves past the typical to look at the unusual type of case, where there is a radical leap forward (equivalent to Thomas Kuhn’s paradigm shifts in science), as with Picasso and Les Demoiselles

As Sgourev carefully argued, the Paris art market of Picasso’s time had recently become receptive to the commercial possibilities of risk-taking. Thus, artistic innovation was becoming commercially viable. Breaking with the past was then being encouraged for the first time. It would soon be demanded.

Most significantly for our purposes, Sgourev’s analysis of Cubism suggests that having an exceptional idea isn’t enough. For radical innovation really to take hold, market conditions have to be right, making its success a function of luck and timing as much as genius. Note that Van Gogh — no less a genius than Picasso — never sold a painting in his lifetime.

As noted above, we all like to think that our successes are earned and that only our failures are due to luck – bad luck. But the old expression – it’s better to be lucky than good – is at least partly true. That said, it’s best to be lucky *and* good. As a consequence, in all probabilistic fields (which is nearly all of them), the best performers dwell on process and diversify their bets. You should do the same…

As [Nate] Silver emphasizes in The Signal and the Noise, we readily overestimate the degree of predictability in complex systems [and t]he experts we see in the media are much too sure of themselves (I wrote about this problem in our industry from a slightly different angle…). Much of what we attribute to skill is actually luck.

Plan accordingly.

Taking the unaccountable into account: “Randomness Rules,” from Bob Seawright (@RPSeawright), via @JVLast

[image above: source]

* James Gleick, The Information: A History, a Theory, a Flood

###

As we contemplate chance, we might spare a thought for Oskar Morgenstern; he died on this date in 1977. An economist who left Vienna for Princeton after the Nazi annexation of Austria, he collaborated with the mathematician John von Neumann to write Theory of Games and Economic Behavior, published in 1944, which is recognized as the first book on game theory – thus co-founding the field.

Game theory was developed extensively in the 1950s, and has become widely recognized as an important tool in many fields– perhaps especially in the study of evolution. Eleven game theorists have won the economics Nobel Prize, and John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

Game theory’s roots date back (at least) to the 1654 letters between Pascal and Fermat, which (along with work by Cardano and Huygens) marked the beginning of probability theory. (See Peter Bernstein’s marvelous Against the Gods.) The application of probability (Bayes’ rule, discrete and continuous random variables, and the computation of expectations) accounts for the utility of game theory; the role of randomness (along with the behavioral psychology of a game’s participants) explains why it’s not a perfect predictor.
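[For the “computation of expectations,” a minimal worked example – the game and function names are our illustration, not the post’s – using matching pennies, the textbook zero-sum game:]

```python
from fractions import Fraction

# Matching pennies: each player secretly picks Heads or Tails; the row
# player wins 1 on a match and loses 1 on a mismatch (zero-sum).
payoff = {("H", "H"): 1, ("H", "T"): -1,
          ("T", "H"): -1, ("T", "T"): 1}

def expected_payoff(p_row_heads: Fraction, p_col_heads: Fraction) -> Fraction:
    """Row player's expected payoff: sum over outcomes of payoff x probability."""
    total = Fraction(0)
    for row, p_r in (("H", p_row_heads), ("T", 1 - p_row_heads)):
        for col, p_c in (("H", p_col_heads), ("T", 1 - p_col_heads)):
            total += p_r * p_c * payoff[(row, col)]
    return total

half = Fraction(1, 2)
print(expected_payoff(half, half))            # 0: the 50/50 mixed equilibrium
print(expected_payoff(Fraction(3, 4), half))  # still 0: nothing beats an
                                              # opponent who randomizes 50/50
```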

source

Written by (Roughly) Daily

July 26, 2021 at 1:00 am
