Below are two pages of the hypothetical book on macro. Once (if?) I get to the main part of the book, I will need good sources for analysis of pre-1920's macroeconomic events and for postwar events in other countries.

In Search of the Northwest Passage
Macroeconomics is concerned with the causes and cures for fluctuations in output, employment, and inflation. However, this does not make macro a self-contained subject. Instead, macroeconomic thinking is shaped by overlaps with many other fields of economics, including several branches of microeconomics. The differences among major schools of thought in macro can be described in terms of which overlaps are emphasized and which are disregarded.
Think of the macroeconomist as an explorer, in search of a Northwest Passage that will provide clear, direct analysis of fluctuations in output, employment, and inflation. Other branches of economics represent possible paths to take to try to locate this Northwest Passage. For example, because unemployment is such a mystifying phenomenon, a case can be made that one should search for answers along the path of labor economics. By the same token, many economists have gone down the path of monetary theory, seeking explanations for macroeconomic fluctuations in the peculiar characteristics of money and its role in the economy.
This chapter will describe the many ways in which other branches of economics have provided paths that at least some macroeconomists thought were worth pursuing in order to find the elusive Northwest Passage. These branches include general equilibrium theory, labor economics, industrial organization, international trade, growth theory, finance theory, capital theory, monetary economics, behavioral economics, rational expectations and mathematical optimization, and econometrics. Before discussing these overlaps, I want to mention an overlap that shapes my own current thinking about macro.
The Recalculation Problem
In December of 2008, George Mason University Professor Tyler Cowen wrote on his blog,
Is the financial crisis — which is rapidly becoming the “real economy” crisis — somehow the “dual” of the socialist calculation problem?
…Are there conditions, however rare, under which market adjustment and convergence does not occur? If a few of the vertices get stuck, can it become impossible for the economy to fulfill its mutating pinwheel program of change and adaptation?
This suggests a rather surprising overlap, between macroeconomics and what is known as the socialist calculation debate. That debate, which was sparked in the 1920’s, burned hotly in the 1930’s, and settled to embers by the 1960’s, concerned the question of whether a socialist planner could ever have enough information to manage an economy efficiently. The anti-socialists, notably Mises and Hayek, argued that without market prices a planner would be making wild guesses as to the best use of productive resources. Pro-socialists, notably Oskar Lange and Wassily Leontief, argued that a planner could tease out the information by cleverly querying managers of socialist enterprises in order to map out the production possibilities and trade-offs in the economy.
Both sides of the socialist calculation question were focused on whether central planners could obtain the information needed to allocate resources efficiently. But neither side doubted that markets were capable of generating and disseminating the relevant information. This assumption could have been challenged, in which case the socialist calculation question would have been accompanied by a market calculation question.
The market calculation question is whether the information needed to allocate resources effectively is generated and disseminated by markets. It seems likely that if the market were close to general equilibrium, then prices would represent accurate signals of where resources belong. However, if the economy finds itself far from general equilibrium, there is no mechanism ensuring that it will find its way to general equilibrium quickly. In the meantime, prices could easily be sending signals to market participants that are unclear, inadequate, or even misleading.
In a modern economy, patterns of specialization are complicated. These patterns evolve over time, but typically the changes are gradual. The price system does an effective job of guiding the gradual, evolutionary changes in the patterns of specialization that take place near general equilibrium.
On the other hand, suppose that a dramatic shock renders the current pattern of specialization untenable, requiring major adjustments in terms of how productive resources are allocated. Is the market up to the task? Mathematical economists who looked at dynamic adjustment of the economy outside of general equilibrium did not provide encouraging results. However, with a few exceptions, notably Robert Clower and Axel Leijonhufvud, most economists did not take up the issue of disequilibrium dynamics as a factor in macroeconomics.
Models of disequilibrium dynamics lack mathematical tractability. I cannot fix that. But I would say that insisting on mathematical tractability is behaving like the drunk who refuses to search for his watch where he lost it because it is too dark there. If the best place to look for explanations of recessions is in disequilibrium dynamics, then the fact that this path of economics is poorly lit by the lamp post of mathematical tractability should not drive us to search elsewhere.
My term for disequilibrium dynamics is Recalculation. This is reminiscent of a GPS navigation system which, after you deviate from its route, will say “Recalculating.” I also think in terms of a project manager at a construction site who discovers a problem with some of the work that has been completed. The manager halts all activity in order to recalculate how to proceed in view of the need to go back and fix the problem.
I will be returning to the Recalculation Story later in this book, because it is one of the paths to the Northwest Passage that I consider important. However, let us leave this path for now and go back to consider other, more well-trodden paths.
READER COMMENTS
david
Jun 25 2010 at 6:36pm
Nicely written.
Tangentially: to describe the “socialist calculation debate” as between socialists and anti-socialists is slightly misleading; the debate was between proponents and opponents of the neoclassical approach. It is for this reason that distinctly anti-socialist people like Joseph Schumpeter conceded that in principle the ‘socialists’ were right that it was technically achievable under neoclassical assumptions.
Here’s none other than Murray Rothbard on the topic. I would quote post-Keynesians making the same argument if I could remember who to google for.
If I may, I also suggest mining any of the numerous, numerous approaches to out-of-equilibrium economics that already exist for insights. There are so many that the whole exercise is sorely in need of unifying contributions rather than another attempt to put together yet another arcane framework. Post-Keynesians always like to emphasize their non-equilibrium approaches. The assorted econophysicists have a million hysteresis and path-dependency models to choose from. The agent-based modelers have their own adaptive or generational or whatever models. If anything there is a surfeit of mathematical tractability.
Doc Merlin
Jun 26 2010 at 5:59pm
Someone may have already mentioned this, but, mathematically, you may want to look at something similar to simulated annealing.
Tom Dougherty
Jun 26 2010 at 8:25pm
What if the reason markets are out of equilibrium is that an increase in the demand for money is not matched by a corresponding increase in the supply of money, causing a fall in aggregate demand? The economy is in disequilibrium, but should markets be blamed for this? Should we say that markets cannot calculate because the central planner of the money supply could not tease out the information in a timely fashion to offset the increase in the demand for money? If 2/3rds of the current economic downturn is caused by poor central planning of the money supply, then I don't see how markets can be blamed for their inability to calculate.
We have had a drop in aggregate demand unprecedented in the post-WWII era: 5 straight negative quarters of year-on-year percent change in aggregate demand from the 4th quarter of 2008 through the 4th quarter of 2009. Only in the 1st quarter of 2010 has there been a modest recovery, and there are signs of a possible double dip in aggregate demand in the 2nd quarter. But this is a failure of the central bank and central planning. I hardly think it's appropriate to blame markets for not functioning properly and say they cannot calculate due to these shocks to the economy perpetrated by the central bank.
Ben Daniels
Jun 27 2010 at 11:02am
My favorite general-disequilibrium books (odds are, of course, that you’ve read them already):
Financial Crises and What to Do About Them, by Barry Eichengreen.
Manias, Panics and Crashes: A History of Financial Crises, by Charles Kindleberger and Robert Aliber.
Asian Storm: The Economic Crisis Examined, by Philippe Ries. (A specific case with great ground-level reportage)
Good luck with the book!
ionides
Jun 27 2010 at 11:52pm
I hope you don’t make long elaborate analogies to the Northwest Passage. It’s easy to fall into this trap and as a reader I get impatient when I see it.
david
Jun 28 2010 at 10:30pm
It seems to me that it depends on how one defines sustainable economic activity. If one defines it as self-sustaining or profitable activity which satisfies consumer needs or wants, then the market process, no matter how flawed, is the only mechanism which can provide the information (largely prices) by which dynamic adjustment can begin to take place.
Monetary disequilibrium obviously doesn’t help, but aside from that, fiscal stimulus degrades/confuses the signals provided by the market by replacing consumers’ preferences with policy makers’ preferences. Among other things, this keeps resources in uneconomic uses for longer and delays price effects that would guide entrepreneurs in redeploying resources. Entrepreneurs are aware that market signals are “polluted” and are also concerned about government policy flailing (Robert Higgs’ regime uncertainty) and thus keep their powder dry, awaiting the time when they can have more confidence in market data. In other words, they are reluctant to invest (and employ). Arnold’s recalculation is put on hold.
The socialist calculation debate was about the efficacy of central planning. Fiscal stimulus is a subset of central planning. Perhaps governments can get away with it during inventory adjustments (i.e., recessions) where no significant restructuring is required. If restructuring is required, fiscal stimulus may cushion the short run negative impact (that would otherwise occur due to necessary but temporary idling of resources) but will essentially stop the adjustment process, and sustainable real growth, in its tracks. Again, investors and entrepreneurs understand this (expectations are a factor in fiscal policy as well as monetary policy). The result is an increase in the demand for money as an asset, thus frustrating what most, except Scott Sumner, believe to be expansionary monetary policy.
Chris Koresko
Jun 29 2010 at 2:03am
@Doc Merlin: Yup, Simulated Annealing was mentioned a few days ago as a model for Recalculation. I believe the poster was named Doug.
Simulated Annealing (SA) is a stochastic global-optimization approach which is widely used in cases which are too complicated to be solved in closed form or by brute force. This makes it a natural picture to consider in the macro-economic context.
In an SA study of a metal, you may produce neatly-organized crystals from an initial ‘hot’ disordered mass of atoms. Order from disorder, without central planning, just as in nature.
In SA, one defines a performance metric, which might be a total system energy for example, that depends on the arrangement of a large collection of sub-units (atoms in a cooling mass of metal, for example, or perhaps flakes of cereal settling in a box). The system is subjected to a series of trials consisting of small localized random changes. A change is always accepted if it improves the performance metric, and is accepted with some probability if it makes it worse. That probability depends on a combination of the size of the degradation (lower probability for larger degradation) and a "temperature" (higher probability for higher temperature). The temperature is gradually reduced as the process proceeds.
Note that each ‘atom’ (person? company?) typically is interacting only with its close neighbors at any given time. There is no need for an individual atom to know what is going on at a distance many times as large as the distance to those near neighbors. This gets around the information problem: the information needed to optimize the system is dispersed across all the atoms in it; there is no central control point. Computationally, it avoids the need to compute and store the value of the performance metric for a potentially vast number of possible configurations, as would be needed to implement a deterministic global optimization.
In SA, you do not expect to find the true global optimum, but you do expect, with very high probability, to find a solution which is nearly as good as the global optimum. Multiple runs with the same starting position and different random number seeds tend to produce solutions which are nearly equally good, but which can differ greatly in detail.
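The acceptance rule described above can be sketched in a few lines of Python. This is only an illustrative toy, not anyone's actual model: the function names, the cooling schedule, and the one-dimensional "energy landscape" are all my own choices, made up for the example.

```python
import math
import random

def simulated_anneal(energy, state, step,
                     t_start=10.0, t_end=1e-3, cooling=0.95,
                     trials_per_temp=100):
    """Minimize `energy` using the Metropolis acceptance rule:
    improvements are always accepted; degradations are accepted
    with probability exp(-delta / temperature)."""
    best = current = state
    t = t_start
    while t > t_end:
        for _ in range(trials_per_temp):
            candidate = step(current)  # small localized random change
            delta = energy(candidate) - energy(current)
            # Accept improvements outright; accept degradations with a
            # probability that shrinks as delta grows and as t falls.
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = candidate
            if energy(current) < energy(best):
                best = current
        t *= cooling  # gradually reduce the temperature
    return best

# Toy example: a bumpy one-dimensional landscape with many local minima.
random.seed(0)
f = lambda x: x * x + 3 * math.sin(5 * x)
result = simulated_anneal(f, state=5.0,
                          step=lambda x: x + random.uniform(-0.5, 0.5))
```

As the comment notes, repeated runs with different seeds will land on different final states of roughly similar quality rather than one exact global optimum.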
Picture pouring a kilogram of breakfast cereal into a box and measuring the volume it occupies. Then start shaking the box, with the goal of minimizing that volume. Over time, the cereal settles as individual flakes find “better” positions and orientations, i.e., better relationships with their neighbors. This happens quickly at first. Gradually you reduce the vigor of the shaking, and the level of the cereal settles down to something stable (small fluctuations). If you repeat the experiment many times, the positions of the individual flakes will be different, but the final level will be about the same.
Now replace minimum-total-volume with maximum-total-output, flakes of cereal with economic actors, shaking with profit-seeking interactions based on limited information about a small subset of the other actors, wave hands vigorously, and voila!
Lord
Jun 29 2010 at 3:31pm
Even if the downturn is primarily a recalculation problem, half of it would still be a demand problem: the recalculating sectors lose income and no longer have it to spend on non-recalculating sectors. This is why government spending can be so helpful, whatever it is spent on, because it creates income that is spent on whatever is desired. This is only true, though, if there is a solution to the recalculation that leads back to full employment and output. If there is no productive use for these resources, then not even government can make them productive for long, and all it can do is ease the transition to that reduced state.