Saturday, September 25, 2004
Rational Expectations
by Thomas J. Sargent
The theory of rational expectations was first proposed by John F. Muth of Indiana University in the early sixties. He used the term to describe the many economic situations in which the outcome depends partly upon what people expect to happen. The price of an agricultural commodity, for example, depends on how many acres farmers plant, which in turn depends on the price that farmers expect to realize when they harvest and sell their crops. As another example, the value of a currency and its rate of depreciation depend partly on what people expect that rate of depreciation to be. That is because people rush to desert a currency that they expect to lose value, thereby contributing to its loss in value. Similarly, the price of a stock or bond depends partly on what prospective buyers and sellers believe it will be in the future.
The use of expectations in economic theory is not new. Many earlier economists, including A. C. Pigou, John Maynard Keynes, and John R. Hicks, assigned a central role in the determination of the business cycle to people's expectations about the future. Keynes referred to this as "waves of optimism and pessimism" that helped determine the level of economic activity. But proponents of the rational expectations theory are more thorough in their analysis of—and assign a more important role to—expectations.
The influences between expectations and outcomes flow both ways. In forming their expectations, people try to forecast what will actually occur. They have strong incentives to use forecasting rules that work well because higher "profits" accrue to someone who acts on the basis of better forecasts, whether that someone be a trader in the stock market or someone considering the purchase of a new car. And when people have to forecast a particular price over and over again, they tend to adjust their forecasting rules to eliminate avoidable errors. Thus, there is continual feedback from past outcomes to current expectations. Translation: in recurrent situations the way the future unfolds from the past tends to be stable, and people adjust their forecasts to conform to this stable pattern.
The concept of rational expectations asserts that outcomes do not differ systematically (i.e., regularly or predictably) from what people expected them to be. The concept is motivated by the same thinking that led Abraham Lincoln to assert, "You can fool some of the people all of the time, and all of the people some of the time, but you cannot fool all of the people all of the time." From the viewpoint of the rational expectations doctrine, Lincoln's statement gets things right. It does not deny that people often make forecasting errors, but it does suggest that errors will not persistently occur on one side or the other.
Economists who believe in rational expectations base their belief on the standard economic assumption that people behave in ways that maximize their utility (their enjoyment of life) or profits. Economists have used the concept of rational expectations to understand a variety of situations in which speculation about the future is a crucial factor in determining current action. Rational expectations is a building block for the "random walk" or "efficient markets" theory of securities prices, the theory of the dynamics of hyperinflations, the "permanent income" and "life-cycle" theories of consumption, the theory of "tax smoothing," and the design of economic stabilization policies.
The Efficient Markets Theory of Stock Prices
One of the earliest and most striking applications of the concept of rational expectations is the efficient markets theory of asset prices. A sequence of observations on a variable (such as daily stock prices) is said to follow a random walk if the current value gives the best possible prediction of future values. The efficient markets theory of stock prices uses the concept of rational expectations to reach the conclusion that, when properly adjusted for discounting and dividends, stock prices follow a random walk. The chain of reasoning goes as follows. In their efforts to forecast prices, investors comb all sources of information, including patterns that they can spot in past price movements.
Investors buy stocks that they expect to have a higher-than-average return and sell those that they expect to have lower returns. When they do so, they bid up the prices of stocks expected to have higher-than-average returns and drive down the prices of those expected to have lower-than-average returns. The prices of the stocks adjust until the expected returns, adjusted for risk, are equal for all stocks. Equalization of expected returns means that investors' forecasts become built into or reflected in the prices of stocks. More precisely, it means that stock prices change so that after an adjustment to reflect dividends, the time value of money, and differential risk, they equal the market's best forecast of the future price. Therefore, the only factors that can change stock prices are random factors that could not be known in advance. Thus, changes in stock prices follow a random walk.
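The chain of reasoning above can be sketched numerically. The toy model below is my own construction, not Sargent's: dividends follow an assumed mean-reverting process, the price equals the discounted expected value of future dividends, and as a consequence the gross return has a constant conditional mean, so no past information helps forecast it.

```python
import random

# Sketch (assumptions mine): under rational expectations the price equals
# the discounted expected value of payoffs, p_t = beta * E_t[p_{t+1} + d_{t+1}].
# With dividends d_{t+1} = dbar + rho*(d_t - dbar) + eps, this pins down
#   p_t = beta*dbar/(1 - beta) + beta*rho*(d_t - dbar)/(1 - beta*rho),
# so the gross return (p_{t+1} + d_{t+1}) / p_t has a constant conditional
# mean of 1/beta: past data cannot forecast it.

random.seed(42)
beta, rho, dbar, T = 0.95, 0.9, 1.0, 100_000

def price(d):
    return beta * dbar / (1 - beta) + beta * rho * (d - dbar) / (1 - beta * rho)

d = dbar
returns, lagged_divs = [], []
for _ in range(T):
    d_next = dbar + rho * (d - dbar) + random.gauss(0, 0.05)
    returns.append((price(d_next) + d_next) / price(d))
    lagged_divs.append(d)
    d = d_next

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

mean_return = sum(returns) / T
print(f"mean gross return: {mean_return:.4f} (theory: {1 / beta:.4f})")
print(f"corr(return, lagged dividend): {corr(returns, lagged_divs):.4f}")
```

The correlation between returns and the lagged dividend comes out near zero even though dividends themselves are highly persistent: the forecastable part of dividends is already built into today's price, leaving only the "random factors that could not be known in advance."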
The random walk theory has been subjected to literally hundreds of empirical tests. The tests tend to support the theory quite strongly. While some studies have found situations that contradict the theory, the theory does explain, at least to a very good first approximation, how asset prices evolve (see Efficient Capital Markets).
The Permanent Income Theory of Consumption
The Keynesian consumption function holds that there is a positive relationship between people's consumption and their income. Early empirical work in the forties and fifties turned up discrepancies between this theory and the data, which Milton Friedman successfully explained with his celebrated "permanent income theory" of consumption. Friedman built upon Irving Fisher's insight that a person's consumption ought not to depend on current income alone, but also on prospects of income in the future. Friedman posited that people consume out of their "permanent income," which can be defined as the level of consumption that can be sustained while leaving wealth intact. In defining "wealth," Friedman included a measure of "human wealth"—namely, the present value of people's expectations of future labor income.
Although Friedman did not formally apply the concept of rational expectations in his work, it is implicit in much of his discussion. Because of its heavy emphasis on the role of expectations about future income, his hypothesis was a prime candidate for the application of rational expectations. In work subsequent to Friedman's, John F. Muth and Stanford's Robert E. Hall imposed rational expectations on versions of Friedman's model, with interesting results. In Hall's version, imposing rational expectations produces the result that consumption is a random walk: the best prediction of future consumption is the present level of consumption. This result encapsulates the consumption-smoothing aspect of the permanent income model and reflects people's efforts to estimate their wealth and to allocate it over time. If consumption in each period is held at a level that is expected to leave wealth unchanged, it follows that wealth and consumption will each equal their values in the previous period plus an unforecastable or unforeseeable random shock—really a forecast error.
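Hall's random-walk result can be illustrated with a stylized simulation. The parameter values and the i.i.d. income process below are my assumptions, chosen for simplicity, not Hall's: with the interest rate satisfying beta*R = 1 and the consumer spending the annuity value of total wealth, each change in consumption equals a multiple of the current income surprise, which is exactly the "unforecastable random shock—really a forecast error" described above.

```python
import math
import random

# Stylized permanent-income consumer (assumptions mine): beta*R = 1,
# i.i.d. labor income around ybar, and consumption equal to the annuity
# value of total wealth: financial assets + current income + "human
# wealth" ybar/(R - 1), the present value of expected future income.
# Consumption then follows a random walk, changing only by (1 - beta)
# times the income *surprise*.

random.seed(7)
beta = 0.96
R = 1 / beta                      # gross interest rate, beta * R = 1
ybar = 1.0
human_wealth = ybar / (R - 1)     # PV of expected future labor income

assets, c_prev = 5.0, None
deltas = []
for t in range(1000):
    y = ybar + random.gauss(0, 0.1)                # income surprise at t
    c = (1 - beta) * (assets + y + human_wealth)   # annuity value of wealth
    if c_prev is not None:
        delta = c - c_prev
        # random-walk property: the change equals the forecast error
        assert math.isclose(delta, (1 - beta) * (y - ybar),
                            rel_tol=1e-6, abs_tol=1e-9)
        deltas.append(delta)
    assets = R * (assets + y - c)                  # budget constraint
    c_prev = c

print(f"mean consumption change: {sum(deltas) / len(deltas):+.5f}")
```

The assertion inside the loop verifies the random-walk property period by period: nothing known in advance, only the income surprise, moves consumption.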
The rational expectations version of the permanent income hypothesis has changed the way economists think about short-term stabilization policies (such as temporary tax cuts) designed to stimulate the economy. Keynesian economists used to believe that tax cuts would boost disposable income and thus cause people to consume more. But according to the permanent income model, temporary tax cuts would have much less of an effect on consumption than Keynesians had thought. The reason is that people are basing their consumption decision on their wealth, not their current disposable income. Because temporary tax cuts are bound to be reversed, they have little or no effect on wealth, and therefore, they have little or no effect on consumption. Thus, the permanent income model had the effect of diminishing the expenditure "multiplier" that economists ascribed to temporary tax cuts.
The rational expectations version of the permanent income model has been extensively tested, with results that are quite encouraging. The evidence is that the model works well but imperfectly. Economists are currently extending the model to take into account factors such as "habit persistence" in consumption and the differing durabilities of various consumption goods. Expanding the theory to incorporate these features alters the pure "random walk" prediction of the theory (and so helps remedy some of the empirical shortcomings of the model), but it leaves the basic permanent income insight intact.
Tax-Smoothing Models
How should a government design tax policy when it knows that people are making decisions partly in response to the government's plans for setting taxes in the future? That is, when participants in the private sector have rational expectations about the government's rules for setting tax rates, what rules should the government use to set tax rates? Robert Lucas and Nancy Stokey, as well as Robert Barro, have studied this problem under the assumption that the government can make and keep commitments to execute the plans that it designs. All three authors have identified situations in which the government should finance a volatile (or unsmooth) sequence of government expenditures with a sequence of tax rates that is quite stable (or smooth) over time. Such policies are called "tax-smoothing" policies. Tax smoothing is a good idea because it minimizes the supply disincentives associated with taxes. For example, workers who pay a 20 percent marginal tax rate every year will reduce their labor supply less (that is, will work more at any given wage) than they would if the government set a 10 percent marginal tax rate in half the years and a 30 percent rate in the other half.
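The arithmetic behind the 20-percent-versus-10/30 example can be made explicit. The key assumption, standard in public finance but not derived in this article, is that the deadweight loss of a tax rises roughly with the square of the tax rate, so a smooth path of rates distorts less than a volatile path with the same average rate.

```python
# Back-of-the-envelope sketch: assume deadweight loss is proportional to
# the squared tax rate (a standard approximation, not derived here).
# Then a steady 20 percent rate beats alternating 10 and 30 percent rates,
# even though both paths have the same average rate.

def deadweight_loss(rate, k=1.0):
    return k * rate ** 2

smooth = [0.20, 0.20]        # 20 percent every year
volatile = [0.10, 0.30]      # 10 percent half the years, 30 percent the rest

loss_smooth = sum(deadweight_loss(t) for t in smooth)
loss_volatile = sum(deadweight_loss(t) for t in volatile)

print(f"smooth: {loss_smooth:.3f}, volatile: {loss_volatile:.3f}")
assert loss_smooth < loss_volatile   # 0.08 < 0.10: smoothing distorts less
```

Because the loss function is convex, the savings from the low-tax years never make up for the disproportionate damage done in the high-tax years; this is the whole case for tax smoothing in two lines of arithmetic.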
During "normal times" a government operating under a tax-smoothing rule typically has close to a balanced budget. But during times of extraordinary expenditures—during wars, for example—the government runs a deficit, which it finances by borrowing. During and after the war the government increases taxes by enough to service the debt it has incurred; in this way the higher taxes that the government imposes to finance the war are spread out over time. Such a policy minimizes the cumulative distorting effects of taxes—the adverse "supply-side" effects.
Barro's tax-smoothing theory helps explain the behavior of the British and U.S. governments in the eighteenth and nineteenth centuries, when the standard pattern was to finance wars with deficits but to set taxes after wars at rates sufficiently high to service the government's debt.
Expectational Error Models of the Business Cycle
A long tradition in business cycle theory has held that errors in people's forecasts are a major cause of business fluctuations. This view was embodied in the Phillips curve (the observed inverse correlation between unemployment and inflation), with economists attributing the correlation to errors that people made in their forecasts of the price level. Before the advent of rational expectations, economists often proposed to "exploit" or "manipulate" the public's forecasting errors in ways designed to generate better performance of the economy over the business cycle. Thus, Robert Hall aptly described the state of economic thinking in 1973 when he wrote:
The benefits of inflation derive from the use of expansionary policy to trick economic agents into behaving in socially preferable ways even though their behavior is not in their own interest.... The gap between actual and expected inflation measures the extent of the trickery.... The optimal policy is not nearly as expansionary [inflationary] when expectations adjust rapidly, and most of the effect of an inflationary policy is dissipated in costly anticipated inflation.
Rational expectations undermines the idea that policymakers can manipulate the economy by systematically making the public have false expectations. Robert Lucas showed that if expectations are rational, it simply is not possible for the government to manipulate those forecast errors in a predictable and reliable way, for the very reason that the errors made by a rational forecaster are inherently unpredictable. Lucas's work led to what has sometimes been called the "policy ineffectiveness proposition." If people have rational expectations, policies that try to manipulate the economy by inducing false expectations may introduce more "noise" into the economy but cannot, on average, improve the economy's performance.
Design of Macroeconomic Policies
The "policy ineffectiveness" result pertains only to those economic policies that have their effects solely by inducing forecast errors. Many government policies work by affecting "margins" or incentives, and the concept of rational expectations delivers no "policy ineffectiveness" result for such policies. In fact, the idea of rational expectations is now being used extensively in such contexts to study the design of monetary, fiscal, and regulatory policies to promote good economic performance.
For example, extensions of the tax-smoothing models are being developed in a variety of directions. The tax-smoothing result depends on various special assumptions about the physical technology for transferring resources over time, and also on the sequence of government expenditures assumed. These assumptions are being relaxed, with interesting modifications of the tax-smoothing prescription being a consequence. Christophe Chamley reached the striking conclusion that an optimal tax scheme involves eventually setting the tax rate on capital to zero, with labor bearing the entire tax burden. To get his result, Chamley assumed that "labor" and "capital" are very different factors, with the total availability of labor being beyond people's control while the supply of capital could be affected by investment and saving. When Chamley's assumptions are altered to acknowledge the "human capital" component of labor, which can be affected by people's decisions, his conclusion about capital taxation is different.
The idea of rational expectations has also been a workhorse in developing prescriptions for optimally choosing monetary policy. Important contributors to this literature have been Truman Bewley and William A. Brock. Bewley and Brock's work describes precisely the contexts in which an optimal monetary arrangement involves having the government pay interest on reserves at the market rate. Their work supports, clarifies, and extends proposals for monetary reform made by Milton Friedman in 1960 and 1968.
Rational expectations has been a working assumption in recent studies that try to explain how monetary and fiscal authorities can retain (or lose) "good reputations" for their conduct of policy. This literature is beginning to help economists understand the multiplicity of government policy strategies followed, for example, in high-inflation and low-inflation countries. In particular, work on "reputational equilibria" in macroeconomics by Robert Barro and by David Gordon and Nancy Stokey has shown that the preferences of citizens and policymakers and the available production technologies and trading opportunities are not by themselves sufficient to determine whether a government will follow a low-inflation or a high-inflation policy mix.
Instead, reputation remains an independent factor even after rational expectations have been assumed.
posted by iambrianfu [ 11:21 AM ]
Friday, September 24, 2004
New Keynesian Economics
by N. Gregory Mankiw
New Keynesian economics is the school of thought in modern macroeconomics that evolved from the ideas of John Maynard Keynes. Keynes wrote The General Theory of Employment, Interest, and Money in the thirties, and his influence among academics and policymakers increased through the sixties. In the seventies, however, new classical economists such as Robert Lucas, Thomas J. Sargent, and Robert Barro called into question many of the precepts of the Keynesian revolution. The label "new Keynesian" describes those economists who, in the eighties, responded to this new classical critique with adjustments to the original Keynesian tenets.
The primary disagreement between new classical and new Keynesian economists is over how quickly wages and prices adjust. New classical economists build their macroeconomic theories on the assumption that wages and prices are flexible. They believe that prices "clear" markets—balance supply and demand—by adjusting quickly. New Keynesian economists, however, believe that market-clearing models cannot explain short-run economic fluctuations, and so they advocate models with "sticky" wages and prices. New Keynesian theories rely on this stickiness of wages and prices to explain why involuntary unemployment exists and why monetary policy has such a strong influence on economic activity.
A long tradition in macroeconomics (including both Keynesian and monetarist perspectives) emphasizes that monetary policy affects employment and production in the short run because prices respond sluggishly to changes in the money supply. According to this view, if the money supply falls, people spend less money, and the demand for goods falls. Because prices and wages are inflexible and don't fall immediately, the decreased spending causes a drop in production and layoffs of workers. New classical economists criticized this tradition because it lacked a coherent theoretical explanation for the sluggish behavior of prices. Much new Keynesian research attempts to remedy this omission.
Menu Costs and Aggregate-Demand Externalities
One reason that prices do not adjust immediately to clear markets is that adjusting prices is costly. To change its prices, a firm may need to send out a new catalog to customers, distribute new price lists to its sales staff, or in the case of a restaurant, print new menus. These costs of price adjustment, called "menu costs," cause firms to adjust prices intermittently rather than continuously.
Economists disagree about whether menu costs can help explain short-run economic fluctuations. Skeptics point out that menu costs usually are very small. They argue that these small costs are unlikely to help explain recessions, which are very costly for society. Proponents reply that small does not mean inconsequential. Even though menu costs are small for the individual firm, they could have large effects on the economy as a whole.
Proponents of the menu-cost hypothesis describe the situation as follows. To understand why prices adjust slowly, one must acknowledge that changes in prices have externalities—that is, effects that go beyond the firm and its customers. For instance, a price reduction by one firm benefits other firms in the economy. When a firm lowers the price it charges, it lowers the average price level slightly and thereby raises real income. (Nominal income is determined by the money supply.) The stimulus from higher income, in turn, raises the demand for the products of all firms. This macroeconomic impact of one firm's price adjustment on the demand for all other firms' products is called an "aggregate-demand externality."
In the presence of this aggregate-demand externality, small menu costs can make prices sticky, and this stickiness can have a large cost to society. Suppose that General Motors announces its prices and then, after a fall in the money supply, must decide whether to cut prices. If it did so, car buyers would have a higher real income and would, therefore, buy more products from other companies as well. But the benefits to other companies are not what General Motors cares about. Therefore, General Motors would sometimes decide that a price cut is not worth the menu cost and leave its price unchanged, even though the cut is socially desirable. This is an example in which sticky prices are undesirable for the economy as a whole, even though they may be optimal for those setting prices.
The Staggering of Prices
New Keynesian explanations of sticky prices often emphasize that not everyone in the economy sets prices at the same time. Instead, the adjustment of prices throughout the economy is staggered. Staggering complicates the setting of prices because firms care about their prices relative to those charged by other firms. Staggering can make the overall level of prices adjust slowly, even when individual prices change frequently.
Consider the following example. Suppose, first, that price setting is synchronized: every firm adjusts its price on the first of every month. If the money supply and aggregate demand rise on May 10, output will be higher from May 10 to June 1 because prices are fixed during this interval. But on June 1 all firms will raise their prices in response to the higher demand, ending the three-week boom.
Now suppose that price setting is staggered: Half the firms set prices on the first of each month and half on the fifteenth. If the money supply rises on May 10, then half the firms can raise their prices on May 15. Yet because half of the firms will not be changing their prices on the fifteenth, a price increase by any firm will raise that firm's relative price, which will cause it to lose customers. Therefore, these firms will probably not raise their prices very much. (In contrast, if all firms are synchronized, all firms can raise prices together, leaving relative prices unaffected.) If the May 15 price setters make little adjustment in their prices, then the other firms will make little adjustment when their turn comes on June 1, because they also want to avoid relative price changes. And so on. The price level rises slowly as the result of small price increases on the first and the fifteenth of each month. Hence, staggering makes the price level sluggish, because no firm wishes to be the first to post a substantial price increase.
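The staggered case can be simulated with a deliberately simple model of my own (the weight w and the two-cohort structure are assumptions, not part of the article): each reset date, one cohort moves its price only partway toward the new money level because it cares about its price relative to the other cohort's still-fixed price.

```python
# Stylized staggered-pricing simulation (assumptions mine).  Money m jumps
# from 1.0 to 1.1.  Each period one of two cohorts resets its price to a
# weighted average of the new money level and the other cohort's posted
# price; the weight w on money is small because a firm dislikes raising
# its *relative* price.  The price level P is the average of the cohorts.

w = 0.3                      # how far a resetting firm moves toward m
m = 1.1                      # money supply after the May 10 increase
x = [1.0, 1.0]               # the two cohorts' posted prices (old level)

path = []
for t in range(12):          # twelve reset dates (six "months")
    cohort = t % 2
    # compromise between the new money level and the rival's fixed price
    x[cohort] = w * m + (1 - w) * x[1 - cohort]
    path.append((x[0] + x[1]) / 2)

print([round(p, 4) for p in path])   # creeps up toward 1.1, never jumps
```

Each cohort's increase is damped by the other cohort's unchanged price, and that damped increase in turn damps the next cohort's adjustment, so the price level converges to the new money level only gradually: exactly the sluggishness the paragraph above describes. (With synchronized resets, all firms could move to 1.1 at once, since relative prices would be unaffected.)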
Coordination Failure
Some new Keynesian economists suggest that recessions result from a failure of coordination. Coordination problems can arise in the setting of wages and prices because those who set them must anticipate the actions of other wage and price setters. Union leaders negotiating wages are concerned about the concessions other unions will win. Firms setting prices are mindful of the prices other firms will charge.
To see how a recession could arise as a failure of coordination, consider the following parable. The economy is made up of two firms. After a fall in the money supply, each firm must decide whether to cut its price. Each firm wants to maximize its profit, but its profit depends not only on its pricing decision but also on the decision made by the other firm.
If neither firm cuts its price, the amount of real money (the amount of money divided by the price level) is low, a recession ensues, and each firm makes a profit of only fifteen dollars.
If both firms cut their price, real money balances are high, a recession is avoided, and each firm makes a profit of thirty dollars. Although both firms prefer to avoid a recession, neither can do so by its own actions. If one firm cuts its price while the other does not, a recession follows. The firm making the price cut makes only five dollars, while the other firm makes fifteen dollars.
The essence of this parable is that each firm's decision influences the set of outcomes available to the other firm. When one firm cuts its price, it improves the opportunities available to the other firm, because the other firm can then avoid the recession by cutting its price. This positive impact of one firm's price cut on the other firm's profit opportunities might arise because of an aggregate-demand externality.
What outcome should one expect in this economy? On the one hand, if each firm expects the other to cut its price, both will cut prices, resulting in the preferred outcome in which each makes thirty dollars. On the other hand, if each firm expects the other to maintain its price, both will maintain their prices, resulting in the inferior solution, in which each makes fifteen dollars. Hence, either of these outcomes is possible: there are multiple equilibria.
The inferior outcome, in which each firm makes fifteen dollars, is an example of a coordination failure. If the two firms could coordinate, they would both cut their price and reach the preferred outcome. In the real world, unlike in this parable, coordination is often difficult because the number of firms setting prices is large. The moral of the story is that even though sticky prices are in no one's interest, prices can be sticky simply because people expect them to be.
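The parable is a two-by-two game, and the claim that "either of these outcomes is possible" can be checked mechanically by testing every strategy pair for the Nash property (no firm gains by unilaterally switching), using exactly the dollar payoffs given in the text.

```python
from itertools import product

# The two-firm parable as a payoff matrix, using the text's numbers:
# both cut -> (30, 30); neither cuts -> (15, 15); one firm cutting alone
# earns 5 while the firm holding its price earns 15.
payoff = {
    ("cut", "cut"):   (30, 30),
    ("cut", "keep"):  (5, 15),
    ("keep", "cut"):  (15, 5),
    ("keep", "keep"): (15, 15),
}

def other(action):
    return "keep" if action == "cut" else "cut"

def is_nash(a, b):
    """Neither firm gains by unilaterally switching its action."""
    return (payoff[(a, b)][0] >= payoff[(other(a), b)][0]
            and payoff[(a, b)][1] >= payoff[(a, other(b))][1])

equilibria = [(a, b) for a, b in product(("cut", "keep"), repeat=2)
              if is_nash(a, b)]
print(equilibria)
```

The search finds two equilibria, (cut, cut) and (keep, keep): the multiple-equilibria result. The fifteen-dollar outcome survives as an equilibrium even though both firms prefer the thirty-dollar one, which is precisely what "coordination failure" means here.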
Efficiency Wages
Another important part of new Keynesian economics has been the development of new theories of unemployment. Persistent unemployment is a puzzle for economic theory. Normally, economists presume that an excess supply of labor would exert a downward pressure on wages. A reduction in wages would, in turn, reduce unemployment by raising the quantity of labor demanded. Hence, according to standard economic theory unemployment is a self-correcting problem.
New Keynesian economists often turn to theories of what they call efficiency wages to explain why this market-clearing mechanism may fail. These theories hold that high wages make workers more productive. The influence of wages on worker efficiency may explain the failure of firms to cut wages despite an excess supply of labor. Even though a wage reduction would lower a firm's wage bill, it would also—if the theories are correct—cause worker productivity and the firm's profits to decline.
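The logic that a wage cut can lower profits has a classic formalization, sketched here with an effort function I am assuming for illustration (the article does not specify one): the firm minimizes the cost per unit of *effective* labor, wage divided by effort, and that cost is minimized at a wage strictly above the workers' outside option.

```python
# Efficiency-wage sketch (effort function assumed for illustration):
# effort e(w) = sqrt(w - 1) for w > 1, where 1 is the workers' outside
# option.  The firm cares about the cost per efficiency unit of labor,
# w / e(w), not the wage itself -- so cutting the wage can raise costs.

def cost_per_efficiency_unit(w):
    return w / (w - 1) ** 0.5

# grid search for the cost-minimizing "efficiency wage"
grid = [1.01 + i * 0.001 for i in range(3000)]
w_star = min(grid, key=cost_per_efficiency_unit)
print(f"efficiency wage ~= {w_star:.2f}")   # analytically, w* = 2

# Cutting the wage below w* raises cost per efficiency unit, which is
# why an excess supply of labor need not push the wage down:
assert cost_per_efficiency_unit(1.5) > cost_per_efficiency_unit(w_star)
```

At the optimum the elasticity of effort with respect to the wage equals one (the Solow condition); below that wage, the productivity lost exceeds the payroll saved, so the firm holds the wage above the market-clearing level even when unemployed workers would accept less.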
There are various theories about how wages affect worker productivity. One efficiency-wage theory holds that high wages reduce labor turnover. Workers quit jobs for many reasons—to accept better positions at other firms, to change careers, or to move to other parts of the country. The more a firm pays its workers, the greater their incentive to stay with the firm. By paying a high wage, a firm reduces the frequency of quits, thereby decreasing the time spent hiring and training new workers.
A second efficiency-wage theory holds that the average quality of a firm's work force depends on the wage it pays its employees. If a firm reduces wages, the best employees may take jobs elsewhere, leaving the firm with less productive employees who have fewer alternative opportunities. By paying a wage above the equilibrium level, the firm may avoid this adverse selection, improve the average quality of its work force, and thereby increase productivity.
A third efficiency-wage theory holds that a high wage improves worker effort. This theory posits that firms cannot perfectly monitor the work effort of their employees and that employees must themselves decide how hard to work. Workers can choose to work hard, or they can choose to shirk and risk getting caught and fired. The firm can raise worker effort by paying a high wage. The higher the wage, the greater is the cost to the worker of getting fired. By paying a higher wage, a firm induces more of its employees not to shirk and, thus, increases their productivity.
Policy Implications
Because new Keynesian economics is a school of thought regarding macroeconomic theory, its adherents do not necessarily share a single view about economic policy. At the broadest level new Keynesian economics suggests—in contrast to some new classical theories—that recessions do not represent the efficient functioning of markets. The elements of new Keynesian economics, such as menu costs, staggered prices, coordination failures, and efficiency wages, represent substantial departures from the assumptions of classical economics, which provides the intellectual basis for economists' usual justification of laissez-faire. In new Keynesian theories recessions are caused by some economy-wide market failure. Thus, new Keynesian economics provides a rationale for government intervention in the economy, such as countercyclical monetary or fiscal policy. Whether policymakers should intervene in practice, however, is a more difficult question that entails various political as well as economic judgments.
posted by iambrianfu [ 8:36 AM ]
Keynesian Economics
Keynesian economics is a theory of total spending in the economy (called aggregate demand) and of its effects on output and inflation. Although the term is used (and abused) to describe many things, six principal tenets seem central to Keynesianism. The first three describe how the economy works.
1. A Keynesian believes that aggregate demand is influenced by a host of economic decisions—both public and private—and sometimes behaves erratically. The public decisions include, most prominently, those on monetary and fiscal (i.e., spending and tax) policy. Some decades ago, economists heatedly debated the relative strengths of monetary and fiscal policy, with some Keynesians arguing that monetary policy is powerless, and some monetarists arguing that fiscal policy is powerless. Both of these are essentially dead issues today. Nearly all Keynesians and monetarists now believe that both fiscal and monetary policy affect aggregate demand. A few economists, however, believe in what is called debt neutrality—the doctrine that substitutions of government borrowing for taxes have no effects on total demand (more on this below).
2. According to Keynesian theory, changes in aggregate demand, whether anticipated or unanticipated, have their greatest short-run impact on real output and employment, not on prices. This idea is portrayed, for example, in Phillips curves that show inflation changing only slowly when unemployment changes. Keynesians believe the short run lasts long enough to matter. They often quote Keynes's famous statement "In the long run, we are all dead" to make the point.
Anticipated monetary policy (that is, policies that people expect in advance) can produce real effects on output and employment only if some prices are rigid—if nominal wages (wages in dollars, not in real purchasing power), for example, do not adjust instantly. Otherwise, an injection of new money would change all prices by the same percentage. So Keynesian models generally either assume or try to explain rigid prices or wages. Rationalizing rigid prices is hard to do because, according to standard microeconomic theory, real supplies and demands do not change if all nominal prices rise or fall proportionally.
But Keynesians believe that, because prices are somewhat rigid, fluctuations in any component of spending—consumption, investment, or government expenditures—cause output to fluctuate. If government spending increases, for example, and all other components of spending remain constant, then output will increase. Keynesian models of economic activity also include a so-called multiplier effect. That is, output increases by a multiple of the original change in spending that caused it. Thus, a $10 billion increase in government spending could cause total output to rise by $15 billion (a multiplier of 1.5) or by $5 billion (a multiplier of 0.5). Contrary to what many people believe, Keynesian analysis does not require that the multiplier exceed 1.0. For Keynesian economics to work, however, the multiplier must be greater than zero.
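The multiplier arithmetic in this tenet can be reproduced as a geometric series. The marginal propensity to consume used below is an assumed value, chosen only because it generates the 1.5 multiplier quoted in the text.

```python
# Textbook spending-multiplier sketch (mpc value assumed): each dollar of
# new spending becomes someone's income, a fraction mpc of which is spent
# again, and so on.  Total output change = delta_g * (1 + mpc + mpc^2 + ...)
#                                        = delta_g / (1 - mpc).
# Note: this simple chain always yields a multiplier above 1; multipliers
# below 1, which the text also allows, require offsets (e.g., crowding
# out) that this sketch omits.

def total_output_change(delta_g, mpc, rounds=10_000):
    return sum(delta_g * mpc ** k for k in range(rounds))

delta_g = 10.0               # $10 billion of extra government spending
mpc = 1 / 3                  # assumed: implies a multiplier of 1.5
print(round(total_output_change(delta_g, mpc), 6))
```

Summing the chain of respending gives the $15 billion figure from the text: a $10 billion spending increase times a multiplier of 1/(1 - 1/3) = 1.5.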
3. Keynesians believe that prices and, especially, wages respond slowly to changes in supply and demand, resulting in shortages and surpluses, especially of labor. Even though monetarists are more confident than Keynesians in the ability of markets to adjust to changes in supply and demand, many monetarists accept the Keynesian position on this matter. Milton Friedman, for example, the most prominent monetarist, has written: "Under any conceivable institutional arrangements, and certainly under those that now prevail in the United States, there is only a limited amount of flexibility in prices and wages." In current parlance, that would certainly be called a Keynesian position.
No policy prescriptions follow from these three beliefs alone. And many economists who do not call themselves Keynesian—including most monetarists—would, nevertheless, accept the entire list. What distinguishes Keynesians from other economists is their belief in the following three tenets about economic policy.
4. Keynesians do not think that the typical level of unemployment is ideal—partly because unemployment is subject to the caprice of aggregate demand, and partly because they believe that prices adjust only gradually. In fact, Keynesians typically see unemployment as both too high on average and too variable, although they know that rigorous theoretical justification for these positions is hard to come by. Keynesians also feel certain that periods of recession or depression are economic maladies, not efficient market responses to unattractive opportunities. (Monetarists, as already noted, have a deeper belief in the invisible hand.)
5. Many, but not all, Keynesians advocate activist stabilization policy to reduce the amplitude of the business cycle, which they rank among the most important of all economic problems. Here monetarists (and even some conservative Keynesians) part company with activist Keynesians, doubting either the efficacy of stabilization policy or the wisdom of attempting it.
This does not mean that Keynesians advocate what used to be called fine-tuning—adjusting government spending, taxes, and the money supply every few months to keep the economy at full employment. Almost all economists, including most Keynesians, now believe that the government simply cannot know enough soon enough to fine-tune successfully. Three lags make it unlikely that fine-tuning will work. First, there is a lag between the time that a change in policy is required and the time that the government recognizes this. Second, there is a lag between when the government recognizes that a change in policy is required and when it takes action. In the United States, this lag is often very long for fiscal policy because Congress and the administration must first agree on most changes in spending and taxes. The third lag comes between the time that policy is changed and when the changes affect the economy. This, too, can be many months. Yet many Keynesians still believe that more modest goals for stabilization policy—coarse-tuning, if you will—are not only defensible, but sensible. For example, an economist need not have detailed quantitative knowledge of lags to prescribe a dose of expansionary monetary policy when the unemployment rate is 10 percent or more—as it was in many leading industrial countries in the eighties.
6. Finally, and even less unanimously, many Keynesians are more concerned about combating unemployment than about conquering inflation. They have concluded from the evidence that the costs of low inflation are small. However, there are plenty of anti-inflation Keynesians. Most of the world's current and past central bankers, for example, merit this title whether they like it or not. Needless to say, views on the relative importance of unemployment and inflation heavily influence the policy advice that economists give and that policymakers accept. Keynesians typically advocate more aggressively expansionist policies than non-Keynesians.
Keynesians' belief in aggressive government action to stabilize the economy is based on value judgments and on the beliefs that (a) macroeconomic fluctuations significantly reduce economic well-being, (b) the government is knowledgeable and capable enough to improve upon the free market, and (c) unemployment is a more important problem than inflation.
The long, and to some extent, continuing battle between Keynesians and monetarists has been fought primarily over (b) and (c).
In contrast, the briefer and more recent debate between Keynesians and new classical economists has been fought primarily over (a) and over the first three tenets of Keynesianism—tenets that the monetarists had accepted. New classicals believe that anticipated changes in the money supply do not affect real output; that markets, even the labor market, adjust quickly to eliminate shortages and surpluses; and that business cycles may be efficient. For reasons that will be made clear below, I believe that the "objective" scientific evidence on these matters points strongly in the Keynesian direction.
Before leaving the realm of definition, however, I must underscore several glaring and intentional omissions.
First, I have said nothing about the rational expectations school of thought (see Rational Expectations). Like Keynes himself, many Keynesians doubt that school's view that people use all available information to form their expectations about economic policy. Other Keynesians accept the view. But when it comes to the large issues with which I have concerned myself, nothing much rides on whether or not expectations are rational. Rational expectations do not, for example, preclude rigid prices. Stanford's John Taylor and MIT's Stanley Fischer have constructed rational expectations models with sticky prices that are thoroughly Keynesian by my definition. I should note, though, that some new classicals see rational expectations as much more fundamental to the debate.
The second omission is the hypothesis that there is a "natural rate" of unemployment in the long run. Prior to 1970, Keynesians believed that the long-run level of unemployment depended on government policy, and that the government could achieve a low unemployment rate by accepting a high but steady rate of inflation. In the late sixties Milton Friedman, a monetarist, and Columbia's Edmund Phelps, a Keynesian, rejected the idea of such a long-run trade-off on theoretical grounds. They argued that the only way the government could keep unemployment below what they called the "natural rate" was with macroeconomic policies that would continuously drive inflation higher and higher. In the long run, they argued, the unemployment rate could not be below the natural rate. Shortly thereafter, Keynesians like Northwestern's Robert Gordon presented empirical evidence for Friedman's and Phelps's view. Since about 1972 Keynesians have integrated the "natural rate" of unemployment into their thinking. So the natural rate hypothesis played essentially no role in the intellectual ferment of the 1975-85 period.
Third, I have ignored the choice between monetary and fiscal policy as the preferred instrument of stabilization policy. Economists differ about this and occasionally change sides. By my definition, however, it is perfectly possible to be a Keynesian and still believe either that responsibility for stabilization policy should, in principle, be ceded to the monetary authority or that it is, in practice, so ceded.
Keynesian theory was much denigrated in academic circles from the midseventies until the mideighties. It has staged a strong comeback since then, however. The main reason appears to be that Keynesian economics was better able to explain the economic events of the seventies and eighties than its principal intellectual competitor, new classical economics.
True to its classical roots, new classical theory emphasizes the ability of a market economy to cure recessions by downward adjustments in wages and prices. The new classical economists of the midseventies attributed economic downturns to people's misperceptions about what was happening to relative prices (such as real wages). Misperceptions would arise, they argued, if people did not know the current price level or inflation rate. But such misperceptions should be fleeting and surely cannot be large in societies in which price indexes are published monthly and the typical monthly inflation rate is under 1 percent. Therefore, economic downturns, by the new classical view, should be mild and brief. Yet during the eighties most of the world's industrial economies endured deep and long recessions. Keynesian economics may be theoretically untidy, but it certainly is a theory that predicts periods of persistent, involuntary unemployment.
According to new classical theory, a correctly perceived decrease in the growth of the money supply should have only small effects, if any, on real output. Yet when the Federal Reserve and the Bank of England announced that monetary policy would be tightened to fight inflation, and then made good on their promises, severe recessions followed in each country. New classicals might claim that the tightening was unanticipated (because people did not believe what the monetary authorities said). Perhaps it was in part. But surely the broad contours of the restrictive policies were anticipated, or at least correctly perceived as they unfolded. Old-fashioned Keynesian theory, which says that any monetary restriction is contractionary because firms and individuals are locked into fixed-price contracts, not inflation-adjusted ones, seems more consistent with actual events.
An offshoot of new classical theory formulated by Harvard's Robert Barro is the idea of debt neutrality. Barro argues that inflation, unemployment, real GNP, and real national saving should not be affected by whether the government finances its spending with high taxes and low deficits or with low taxes and high deficits. Because people are rational, he argues, they will correctly perceive that low taxes and high deficits today must mean higher future taxes for them and their heirs. They will, Barro argues, cut consumption and increase their saving by one dollar for each dollar increase in future tax liabilities. Thus, a rise in private saving should offset any increase in the government's deficit. Naïve Keynesian analysis, by contrast, sees an increased deficit, with government spending held constant, as an increase in aggregate demand. If, as happened in the United States, the stimulus to demand is nullified by contractionary monetary policy, real interest rates should rise strongly. There is no reason, in the Keynesian view, to expect the private saving rate to rise.
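Barro's accounting claim, and the contrast with the naive Keynesian view, can be made concrete with toy numbers. All figures below are entirely hypothetical, chosen only to show the bookkeeping:

```python
# National saving = private saving + public saving, where public saving is the
# negative of the government deficit. Hypothetical figures, in $ billions.
def national_saving(private_saving, deficit):
    return private_saving - deficit

baseline = national_saving(private_saving=500, deficit=0)

# Barro's view: a $100bn deficit-financed tax cut raises private saving
# one-for-one, leaving national saving unchanged.
barro = national_saving(private_saving=600, deficit=100)

# Naive Keynesian view: private saving does not rise, so national saving falls.
keynesian = national_saving(private_saving=500, deficit=100)

print(baseline, barro, keynesian)  # 500 500 400
```

The 1981-84 episode described below is, in effect, a test of which of the two middle lines better describes behavior.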
The massive U.S. tax cuts between 1981 and 1984 provided something approximating a laboratory test of these alternative views. What happened? The private saving rate did not rise. Real interest rates soared, even though a surprisingly large part of the shock was absorbed by exchange rates rather than by interest rates. With fiscal stimulus offset by monetary contraction, real GNP growth was approximately unaffected; it grew at about the same rate as it had in the recent past. Again, this all seems more consistent with Keynesian than with new classical theory.
Finally, there was the European depression of the eighties, which was the worst since the depression of the thirties. The Keynesian explanation is straightforward. Governments, led by the British and German central banks, decided to fight inflation with highly restrictive monetary and fiscal policies. The anti-inflation crusade was strengthened by the European Monetary System, which, in effect, spread the stern German monetary policy all over Europe. The new classical school has no comparable explanation. New classicals, and conservative economists in general, argue that European governments interfere more heavily in labor markets (with high unemployment benefits, for example, and restrictions on firing workers). But most of these interferences were in place in the early seventies, when unemployment was extremely low.
posted by iambrianfu [ 8:35 AM ]
Friday, August 20, 2004
The numbers featuring in an ordinal utility function are thus not measuring any quantity of anything. A utility function in which magnitudes do matter is called 'cardinal'.
It was said above that games of perfect information are the (logically) simplest sorts of games. This is so because in such games (as long as the games are finite, that is, terminate after a known number of actions) players and analysts can use a straightforward procedure for predicting outcomes. A rational player in such a game chooses her first action by considering each series of responses and counter-responses that will result from each action open to her. She then asks herself which of the available final outcomes brings her the HIGHEST UTILITY, and chooses the action that starts the chain leading to this outcome. This process is called BACKWARD INDUCTION (because the reasoning works backwards from eventual outcomes to present decision problems).
Trees are used to represent sequential games, because they show the order in which actions are taken by the players. However, games are sometimes represented on matrices rather than trees. This is the second type of mathematical object used to represent games. Matrices, unlike trees, simply show the outcomes, represented in terms of the players' utility functions, for every possible combination of strategies the players might use. For example, it makes sense to display the river-crossing game from Section 1 on a matrix, since in that game both the fugitive and the hunter have just one move each, and each chooses their move in ignorance of what the other has decided to do.
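A strategic-form matrix can be written out as a simple table of utility pairs, one cell per combination of simultaneously chosen strategies. The payoffs below are illustrative, in the spirit of a pursuit game like the river-crossing example (the hunter wins when she waits where the fugitive crosses); they are not the source's exact numbers:

```python
# Each cell holds (fugitive's ordinal utility, hunter's ordinal utility)
# for one combination of simultaneously chosen strategies.
crossings = ["bridge_1", "bridge_2"]   # fugitive's (row player's) strategies
stakeouts = ["bridge_1", "bridge_2"]   # hunter's (column player's) strategies
payoffs = {
    ("bridge_1", "bridge_1"): (0, 1),  # caught
    ("bridge_1", "bridge_2"): (1, 0),  # escapes
    ("bridge_2", "bridge_1"): (1, 0),  # escapes
    ("bridge_2", "bridge_2"): (0, 1),  # caught
}
for row in crossings:
    print(row, [payoffs[(row, col)] for col in stakeouts])
```

Unlike a tree, nothing here records who moves when — which is exactly why the matrix form suits games where each player chooses in ignorance of the other's choice.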
Prisoner's Dilemma:
Wherever one action for a player is superior to her other actions for each possible action by the opponent, we say that the first action strictly dominates the second.
When people introduce the PD into popular discussions, you will sometimes hear them say that the police inspector must lock his prisoners into separate rooms so that they can't communicate with one another. The reasoning behind this idea seems obvious: if you could communicate, you'd surely see that you're both better off refusing, and could make an agreement to do so, no? This, one presumes, would remove your conviction that you must confess because you'll otherwise be sold up the river by your partner. In fact, however, this intuition is misleading and its conclusion is false.
When we represent the PD as a strategic-form game, we implicitly assume that the prisoners can't attempt collusive agreement since they choose their actions simultaneously. In this case, agreement before the fact can't help. If you are convinced that your partner will stick to the bargain then you can seize the opportunity to go scot-free by confessing. Of course, you realize that the same temptation will occur to her; but in that case you again want to make sure you confess, as this is your only means of avoiding your worst outcome. Your agreement comes to naught because you have no way of enforcing it; it constitutes what game theorists call ‘cheap talk’.
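The dominance logic behind this argument can be checked mechanically. The ordinal payoffs below are illustrative (higher numbers preferred), not taken from the source's own table:

```python
# Row player's ordinal payoffs in a Prisoner's Dilemma:
# pd[my_action][opponent_action], higher numbers preferred.
pd = {
    "confess": {"confess": 2, "refuse": 4},
    "refuse":  {"confess": 1, "refuse": 3},
}

def strictly_dominates(payoffs, a, b):
    """True if action a yields a strictly higher payoff than action b
    against every possible opponent action."""
    return all(payoffs[a][opp] > payoffs[b][opp] for opp in payoffs[b])

# Confessing strictly dominates refusing, which is why pre-play agreement
# ('cheap talk') cannot rescue the cooperative outcome.
print(strictly_dominates(pd, "confess", "refuse"))  # True
print(strictly_dominates(pd, "refuse", "confess"))  # False
```

Since the same check holds for both prisoners, each confesses regardless of any agreement, landing them in the mutually worse (confess, confess) cell.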
[My experience with Kiw Seng: Even if Mervyn and I were able to communicate before being sent for questioning, the result might still be uncertain. We would both agree that we should refuse to confess, since that brings the best outcome for both of us. But if I know that Mervyn is sure to confess, I may abandon the agreement and confess as well. My decision depends on the relative weight I place on (1) the extra punishment I would be spared if I confessed and he did not, and (2) the strength of my friendship with Mervyn and my fear of any backlash.]
Node: A point at which a player takes an action.
Initial node: The point at which the first action in the game occurs.
Terminal node: Any node which, if reached, ends the game. Each terminal node corresponds to an outcome.
Subgame: Any set of nodes and branches descending uniquely from one node.
Payoff: an ordinal utility number assigned to a player at an outcome.
Outcome: an assignment of a set of payoffs, one to each player in the game.
Strategy: a program instructing a player which action to take at every node in the tree where she could possibly be called on to make a choice
... We can put this another way: in a zero-sum game, my playing a strategy that maximizes my minimum payoff if you play the best you can, and your simultaneously doing the same thing, is just equivalent to our both playing our best strategies, so this pair of so-called ‘maximin’ procedures is guaranteed to find the unique solution to the game, which is its unique NE. (In tic-tac-toe, this is a draw. You can't do any better than drawing, and neither can I, if both of us are trying to win and trying not to lose.)
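The pair of maximin procedures described above can be computed directly for a small zero-sum game. The matrix below is illustrative and deliberately chosen so that the game has a saddle point, i.e. a pure-strategy solution where the two procedures meet:

```python
# A two-player zero-sum game: the matrix gives the row player's payoffs;
# the column player receives the negation. Illustrative numbers.
matrix = [
    [4, 2, 5],
    [3, 1, 6],
    [2, 0, 3],
]

def maximin_row(m):
    """Row maximizing the row player's worst-case (minimum) payoff."""
    return max(range(len(m)), key=lambda i: min(m[i]))

def minimax_col(m):
    """Column minimizing the row player's best case -- the column player's
    own maximin, since her payoffs are the negation of the matrix."""
    cols = list(zip(*m))
    return min(range(len(cols)), key=lambda j: max(cols[j]))

i, j = maximin_row(matrix), minimax_col(matrix)
print(i, j, matrix[i][j])  # the two procedures meet at the saddle point
```

Here the row player's guaranteed minimum (2, in row 0) equals the column player's guaranteed maximum concession (2, in column 1), so the cell (0, 1) is the game's unique pure-strategy equilibrium, just as the tic-tac-toe draw is.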
[Trying to win and trying not to lose are two different things.]
posted by iambrianfu [ 12:48 PM ]
Part of the explanation for game theory's relatively late entry into the field lies in the problems with which economists had historically been concerned. Classical economists, such as Adam Smith and David Ricardo, were mainly interested in the question of how agents in very large markets -- whole nations -- could interact so as to bring about maximum monetary wealth for themselves. Smith's basic insight, that efficiency is best maximized by agents freely seeking mutually advantageous bargains, was mathematically verified in the twentieth century. However, the demonstration of this fact applies only in conditions of ‘perfect competition,’ that is, when firms face no costs of entry or exit into markets, when there are no economies of scale, and when no agents' actions have unintended side-effects on other agents' well-being. Economists always recognized that this set of assumptions is purely an idealization for purposes of analysis, not a possible state of affairs anyone could try (or should want to try) to attain. But until the mathematics of game theory matured near the end of the 1970s, economists had to hope that the more closely a market approximates perfect competition, the more efficient it will be. No such hope, however, can be mathematically or logically justified in general; indeed, as a strict generalization the assumption can be shown to be false.
This article is not about the foundations of economics, but it is important for understanding the origins and scope of game theory to know that perfectly competitive markets have built into them a feature that renders them susceptible to parametric analysis. Because agents face no entry costs to markets, they will open shop in any given market until competition drives all profits to zero. This implies that if costs and demand are fixed, then agents have no options about how much to produce if they are trying to maximize the differences between their costs and their revenues. These production levels can be determined separately for each agent, so none need pay attention to what the others are doing; each agent treats her counterparts as passive features of the environment. The other kind of situation to which classical economic analysis can be applied without recourse to game theory is that of monopoly. Here, quite obviously, non-parametric considerations drop out, since there is only one agent under study. However, both perfect and monopolistic competition are very special and unusual market arrangements. Prior to the advent of game theory, therefore, economists were severely limited in the class of circumstances to which they could neatly apply their models.
Philosophers share with economists a professional interest in the conditions and techniques for the maximization of human welfare. In addition, philosophers have a special concern with the logical justification of actions, and often actions must be justified by reference to their expected outcomes. Without game theory, both of these problems resist analysis wherever non-parametric aspects are relevant. We will demonstrate this shortly by reference to the most famous (though not the most typical) game, the so-called Prisoner's Dilemma, and to other, more typical, games. In doing this, we will need to introduce, define and illustrate the basic elements and techniques of game theory. To this job we therefore now turn.
posted by iambrianfu [ 12:38 PM ]
Saturday, June 05, 2004
EASY monetary policy
A central bank policy designed to stimulate economic growth by lowering short-term interest rates, making money less expensive to borrow. Also called accommodative monetary policy or loose credit. Opposite of tight monetary policy.
TIGHT monetary policy
A central bank policy designed to curb inflation by reducing the reserves of commercial banks (and consequently the money supply) through open market operations. Also called tight money. Opposite of easy monetary policy.
posted by iambrianfu [ 1:24 PM ]
Tuesday, May 04, 2004
In the 1970s, mainstream thinking on unemployment converted to a new conservative orthodoxy. Now, liberals are making those ideas their own
IN 1968 Milton Friedman and Edmund Phelps, in papers written independently, assaulted the prevailing consensus on unemployment. They said it was wrong to suppose, as most economists had up to then, that governments could lower the rate of unemployment if only they would accept a little more inflation. This imagined trade-off, they argued, was a trap: governments that tolerated higher inflation in the hope of lowering unemployment would find that joblessness dipped only briefly before returning to its previous level, while inflation would rise and stay put.
Unemployment, they argued, has an equilibrium or “natural” rate, determined not by the amount of demand in the economy but by the structure of the labour market. Only one level of unemployment, the “natural” rate, is consistent with stable inflation. The notion that there is a “non-accelerating-inflation rate of unemployment”, or NAIRU, became a touchstone of free-market economic policies around the world. But now, economists who believe that governments should be more active in fighting unemployment are making the NAIRU their own.
Why did the NAIRU become identified with conservative thinking? The answer is timing. Just a few years after Messrs Friedman and Phelps unveiled their theories, many countries, including America, suffered rising inflation and rising unemployment at the same time. “Stagflation” appeared to confirm their view that macroeconomic policy could not conquer unemployment. That became conservative orthodoxy--in good part because the political mood was anyway turning against big government. As James Galbraith, a professor at the University of Texas, writes in a symposium in the current issue of the Journal of Economic Perspectives*, “Since Friedman's [paper], orthodox macroeconomics has virtually always leaned against policies to support full employment. In spite of stagnant real wages, it has virtually never leaned the other way.”
Yet it might have been otherwise. A crucial question had been neglected: whether, and under what circumstances, the natural rate might change. Increasingly, economists have concentrated on this question. As a result, by degrees, natural-rate theory has been recaptured by the centre-left.
Mr Friedman, staunch foe of the state, has consistently opposed government intervention in this area as in almost every other. But anticipating more recent work, Mr Phelps, the theory's other pioneer, has long emphasised the need for measures to lower the natural rate--essentially, by making more people employable at the prevailing level of wages. In a book to be published next month† Mr Phelps advocates an enormous programme of employment subsidies aimed both at lowering the NAIRU and at raising the incomes of the working poor.
The initial association of the natural-rate theory and conservative ideas on policy was exaggerated in another way. Suppose, to take the extreme case, that the NAIRU were fixed after all. Even then, allowing inflation to rise in order to curb unemployment may not always be undesirable. It might be worth sacrificing a permanent increase in inflation for a temporary cut in unemployment--if the increase in inflation were tiny, and if the drop in unemployment were big and prolonged (albeit not permanent). In short, the success of the natural-rate paradigm need not have been the triumph for conservatism that it was initially.
These days, liberals have all but reclaimed the ground they surrendered in the 1970s. The main thing is that nobody any longer thinks that the NAIRU is fixed, which puts the spotlight back on policies that might reduce it. In another Perspectives article, Joseph Stiglitz, until recently the chairman of President Clinton's council of economic advisers, argues that the changing demographics of the labour force, as well as more vigorous competition in the markets for goods and jobs, have contributed to a fall of some 1 1/2 percentage points in America's NAIRU since the early 1980s. He estimates it to be 5 1/2% or a little less, which also happens to be the current rate of unemployment.
Robert Gordon of Northwestern University broadly agrees. And a third set of calculations, by Douglas Staiger and James Stock of Harvard and Mark Watson of Princeton, tells roughly the same story (see chart for their estimates).
Swimming lessons
There is less agreement over whether, given that the NAIRU is so changeable, the concept has any meaning at all. Certainly, without Mr Friedman's implicit claim that the natural rate was a constant, the theory would have made far less of a stir in the first place. Nonetheless, Mr Stiglitz says the idea makes sense theoretically. And in practice, he argues, the gap between the estimated NAIRU and actual unemployment turns out to be a good predictor of changes in inflation. Even if the NAIRU moves around and cannot be measured precisely, it still provides a helpful guide to thinking about economic policy.
But praise for the brainchild of Messrs Friedman and Phelps goes only so far. Mr Stiglitz, for one, gives the natural-rate framework a markedly anti-conservative spin. In particular, he argues for a more relaxed approach to fighting inflation. The new research, he says, does indeed suggest that the costs in higher inflation of driving unemployment below the NAIRU are small. So, “in testing the waters, we do not risk drowning. If need be, we can always reverse course. But by experimenting, and showing some hesitation about restraining the economy through higher interest rates as the NAIRU draws nigh, we might learn a little more about the depth of the waters and possibly become better swimmers.”
Mr Friedman would doubtless regard this plea for flexibility in monetary policy with disdain. Giving policymakers that kind of discretion, he would say, is exactly what gets economies drowned. As the American economy continues to expand vigorously, the Federal Reserve will soon have to decide how cautiously to swim.
posted by iambrianfu [ 10:37 PM ]
A manifesto to raise employment
AS DEVELOPED economies emerged from their last deep recession in the early 1990s, there was high anxiety about high unemployment. Thus in 1994 the OECD set out a programme of labour-market reforms through which its member governments might cut the jobless count. A decade on, rich economies are recovering from another slowdown and unemployment is on the rise once again. On September 17th, in its annual Employment Outlook, the OECD issued a new manifesto that reflects the anxieties of western governments about jobs. But the aim this time is not just to reduce unemployment but to raise employment.
Eh? What's the difference? Plenty, in fact. To be counted as unemployed, people usually have to be part of the labour force: they must be available to work and actively looking for a job. But there are many other people of working age—housewives, students, lone parents, disabled people and early retirees—who neither have work nor seek it. These are termed the “economically inactive”. Policies to cut unemployment aim to lower it as a share of the labour force. Policies to raise employment aim to raise the proportion of the whole working-age population with jobs, not just by getting the unemployed into work but also by mobilising the economically inactive.
One reason for setting the new goal is that OECD governments have had some success in bringing down unemployment during the past decade. Generally, jobless rates have fallen; in some countries, notably Ireland, spectacularly. Some of the gains have been lost during the recent global slowdown, but the setback has been less severe than in previous downturns. Much of the long-term improvement will therefore prove sustainable, argues the OECD.
However, the emphasis on raising the employment rate also reflects new concerns, especially in Europe. A particular worry in the early 1990s was that too many young people were unemployed—ie, looking for work but unable to find it. Now, governments are concerned that there are too few young people in the labour market overall. In the European Union, there are now four people of working age for every person aged 65 or over. By 2035, this ratio will have fallen almost to two-to-one. As it falls, the burden on workers of tax-financed pension schemes will rise and public budgets will come under ever greater stress. One remedy is to raise the employment rate and so increase the number of people contributing to public pension schemes.
The rationale for cutting unemployment is straightforward. If people who want to work cannot find jobs, then potential labour resources are being wasted and taxpayers are having to support the involuntarily idle. The rationale for raising employment is less obvious. There are good reasons why students, for example, are not employed. If people choose not to work and can support themselves, what business is it of the state—however cash-strapped—to try to push them into work?
One answer is that inactivity costs taxpayers money, over and above what they must pay to support the unemployed. Besides unemployment payments, there is a wide range of other benefits, including support for early retirement, disability and lone parenthood. In the EU, there is now one person of working age receiving a benefit for every three people in employment. In America and Japan the figure is one in five. In many countries a majority of people who are neither employed nor in education get some form of income support.
Springing the trap
This benefit culture is not just a burden on the working taxpayer. It also generates incentives for people not to work. In the early 1990s, the talk was of unemployment traps, where high, long-lasting benefits dull the spur to find work. Now the OECD highlights inactivity traps where people outside the labour force face little financial incentive to seek work.
The substantial variation in employment rates between broadly similar economies suggests that in some countries there is plenty of scope for getting more inactive people into work. In Iceland, for instance, 80% of working-age women are employed; in Italy, the figure is a mere 42% (see chart). The report estimates that a convergence in working patterns would raise the average employment rate (for women and men) in OECD countries from 65% to 77%. Based on what the economically inactive say about their willingness to work, the increase would be less, to 72%.
It is easier to identify the potential gain than to realise it. The OECD sets out three main policies. The first is to make work pay for the low-skilled. To encourage more of them into the labour market, more use could be made of top-up benefits for those on low wages. Reduced payroll taxes on low-paid jobs would enhance employers' demand for such labour. The second is to remove other barriers to joining the workforce such as the difficulties women find in juggling families and jobs; for example, through subsidising child-care services. The third is to restrict the flow of people on to out-of-work benefits and to encourage those already getting them to look for jobs.
The main question is whether governments will have the courage to implement the more unpopular reforms. By extending in-work benefits, they will offer people a carrot to join the labour force; but as long as people can draw welfare payments when out of work, there will be no accompanying stick. The OECD calls for the removal of incentives to early retirement, but this requires unpopular reforms, not just to pensions but also to other benefits that allow older people to quit the workplace.
Some increase in employment rates will occur anyway as a younger generation of working women replaces an older generation that largely stayed out of the labour force. But most countries will struggle to raise the employment rate without harsh reforms. The OECD's manifesto sets a daunting challenge.
posted by iambrianfu [ 10:17 PM ]
Thursday, April 22, 2004
Can Alan Greenspan move the American economy smoothly towards higher interest rates?
THE chairman of the Federal Reserve may no longer have the aura of infallibility that he enjoyed in the go-go years of the 1990s, but he is still credited by many with a magic touch when it comes to monetary policy. Alan Greenspan’s most recent wizardry—a willingness to slash short-term interest rates to 1% and keep them there—helped America stave off a sharp recession and bolstered a shaky recovery. Soon he must pull off another, possibly tougher, trick: raising interest rates back to more normal levels without either undercutting the recovery (by moving too soon or fast) or letting inflation or asset bubbles get out of hand (by waiting too long).
Listen to Wall Street, and Gandalf is about to swing into action. Over the past couple of weeks a slew of strong economic statistics (especially the creation of 308,000 new jobs in March), followed by news that inflation jumped unexpectedly, has sent financial markets into a frenzy of speculation that the central bank will tighten soon—and aggressively.
Only weeks ago, it was conventional wisdom that the Federal Reserve would keep short-term interest rates unchanged until next year. After the inflation and jobs numbers were published, futures markets began to price in the near certainty of a rate hike in August. Some economists even expect the Fed to move in June, with more raises possible before Americans go to the polls in November. By the end of next year many in the markets expect short-term rates to be at 3.5%.
This week the magician himself spoke, in two testimonies to Congress. Not surprisingly, his message was both more balanced and nuanced than the thunder from Wall Street. Mr Greenspan was clearly preparing the ground for tighter policy but without suggesting that a move was imminent. He noted—as he has done before—that interest rates “must rise at some point” to prevent inflation from taking off, but suggested that “as yet” the long period of low rates had not “fostered an environment in which broad-based inflation pressures appear to be building”.
Nonetheless, compared to the Fed’s last official statement on interest rates in March, Mr Greenspan was noticeably more upbeat. In March, the central bank still saw the risks of lower inflation as slightly higher than those of higher inflation. This week, Mr Greenspan suggested that the disinflation trend had “come to an end” and that the threats of deflation were “no longer an issue”. He described the outlook for America’s economy as “good”. Significantly, he did not use the word “patient” when discussing how long the central bank could keep rates low. Though much of the testimony dwelt on why there was scant risk of inflation getting out of control, Mr Greenspan was clear that the Fed would “act, as necessary” to maintain price stability.
While Wall Street is fretting about out-of-control inflation, Mr Greenspan and his colleagues in Washington, DC, are relieved that inflation has stopped falling. Despite its recent acceleration, America’s inflation rate is extremely low. Excluding the volatile categories of food and energy, the consumer-price index rose 1.6% over the past year. Other measures preferred by the Fed are hardly thrusting skywards. Given the worries about deflation, many central bankers see a small acceleration in consumer prices as a sign of success.
One question looms: is the inflation uptick due largely to one-off factors, such as higher commodity prices, or is it the beginning of a pernicious wage-price spiral? Many Fed governors play down the risk of the latter, pointing out that the labour market is still relatively slack. Despite the March employment figures, the current recovery looks anaemic in terms of job creation; until that changes, they argue, there is scant risk of price pressure being translated into higher wages.
But the central bankers’ view of the job market seems to be changing—at least a little. A few weeks ago, when the March jobs figures came out, Roger Ferguson, the Fed’s vice-chairman, played down their significance: it would, he said, take “some time” to see whether the improvement in the labour market was “fundamental and durable”. By contrast, Mr Greenspan’s tone this week was much more upbeat. He expects the pace of hiring to stay strong.
Yet even with faster job growth, Mr Greenspan seems sanguine about inflationary pressure. There is still, he points out, a “sizeable margin” of slack in the economy; productivity growth remains strong. Fat profits also give firms room to absorb higher wages without raising prices. All those factors allow the central bank wiggle-room before rates must rise.
This analysis is not shared by all of Mr Greenspan’s colleagues on the Fed’s interest-rate-setting committee. The presidents of district central banks, in particular, tend to be more hawkish about inflation. For instance, Bill Poole, president of the St Louis Fed, is concerned about falling behind the inflation curve. He reckons that the Fed must act “aggressively” when inflation risks change. Bob McTeer, president of the Dallas Fed, said this week that the March rise in consumer prices was “disturbing”.
In practice, however, it is Mr Greenspan’s views that count most—and his goals seem to be to prepare the markets for higher rates, but not necessarily to rush to action. The reason is partly tactical: if you spend enough time preparing Wall Street for higher rates, the eventual move may not roil the highly leveraged markets.
America’s rock-bottom interest rates have not just supported spending; they have also coaxed investors into gambling on borrowed money. Raise rates unexpectedly, or too quickly, and the unwinding of these investment bets could cause chaos in financial markets. Given consumers’ dependence on housing wealth, a meltdown in the mortgage market alone could be enough to snuff out the recovery. Memories of 1994, when the Fed began a sharp tightening cycle, are still fresh. Then bond markets collapsed as investors appeared surprised by the central bank’s moves. It is an experience Mr Greenspan will not want to repeat.
In all likelihood, the Fed will send a bevy of signals before it actually raises rates. This week’s testimony is best seen as the opening salvo. But signalling is not a simple business. In June 2003, Mr Greenspan’s efforts to convey his concern about deflation backfired when the central bank failed to cut interest rates by as much as the market expected. And investors can often refuse to take the hint. Fed officials reckon that they sent clear signals about tighter policy in 1993: Wall Street ignored them.
Judging by Wall Street’s jitters, that looks unlikely to happen this time around. After Mr Greenspan’s comments this week, no one can be surprised if rates rise later this year. But there are still enormous uncertainties about how to get from today’s rock-bottom interest rates to more neutral levels. It is not just a question of when the central bank starts to raise rates, but how far they are raised and how fast. If he is to pull off his monetary magic once again, Mr Greenspan will have to become clearer on both counts.
posted by iambrianfu [ 10:29 PM ]
Wednesday, April 21, 2004
Rational expectations
From Wikipedia, the free encyclopedia.
Rational expectations is a theory in economics used to model expectations of future events, proposed by John F. Muth (1961). Modelling expectations is of central importance in economic models. For example, a firm's decision on the level of wages to set in the coming year will be influenced by the expected level of inflation, and the value of a stock is dependent on the expected future income from that stock.
Under rational expectations, it is assumed that actual outcomes do not differ systematically or predictably from what was expected. That is, it assumes that people do not make systematic errors when predicting the future. In an economic model, this is typically modelled by assuming that the expected value of a variable is equal to the value predicted by the model, plus a random error term.
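In symbols (a standard textbook formulation, added here for concreteness rather than taken from the entry): if $p_t$ is the realised value of the variable and $I_{t-1}$ is the information available when the forecast is made, then

```latex
p_t^{e} = E\!\left[ p_t \mid I_{t-1} \right], \qquad
p_t = p_t^{e} + \epsilon_t, \qquad
E\!\left[ \epsilon_t \mid I_{t-1} \right] = 0 ,
```

so the forecast error $\epsilon_t$ is pure noise, uncorrelated with anything the forecaster already knew.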
Rational expectations theories were developed in response to perceived flaws in theories based on adaptive expectations. Under adaptive expectations, expectations of the future value of an economic variable are based on past values. For example, people would be assumed to predict inflation by looking at inflation last year and in previous years. This means that if the government were to choose a policy that led to constantly rising inflation, under adaptive expectations people would be assumed to always underestimate inflation. This may be regarded as unrealistic - surely rational individuals would sooner or later realise the government's policy and take it into account in forming their expectations?
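The systematic error under adaptive expectations is easy to see in a toy simulation (a sketch with assumed numbers: the one-point-a-year inflation path and the adjustment speed `lam` are illustrative, not from the entry):

```python
# Toy model: inflation accelerates by 1 point every year, while people
# forecast adaptively, nudging last period's forecast toward what they saw.

def adaptive_forecasts(inflation, lam=0.5, initial_guess=0.0):
    """Return the forecast error realised in each period."""
    forecast = initial_guess
    errors = []
    for actual in inflation:
        errors.append(actual - forecast)                 # error people actually made
        forecast = forecast + lam * (actual - forecast)  # adaptive update
    return errors

inflation = [1.0 * t for t in range(40)]   # steadily rising inflation
errors = adaptive_forecasts(inflation)

# The error never dies out: it settles at (yearly rise) / lam = 1.0 / 0.5 = 2
# points, so people underestimate inflation forever, as the paragraph says.
print(round(errors[-1], 2))
```

Under rational expectations the same agents would notice the government's policy and forecast next year's inflation as this year's plus one point, driving the systematic error to zero.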
The hypothesis of rational expectations addresses this criticism by assuming that individuals take all available information into account in forming expectations. Individual expectations may still turn out incorrect, but the errors will not be systematic: on average, expectations equal the realised values.
The rational expectations hypothesis has been used to support some controversial conclusions about economic policymaking. The hypothesis is often criticised as an unrealistic model of how expectations are formed.
posted by iambrianfu [ 10:05 PM ]
NAIRU
NON-ACCELERATING INFLATION RATE OF UNEMPLOYMENT
From Wikipedia, the free encyclopedia.
The term NAIRU is an acronym for Non-Accelerating Inflation Rate of Unemployment. It is a concept in economic theory significant in the interplay of macroeconomics and microeconomics.
The concept arose in the wake of the study of the Phillips curve, which demonstrated an observed negative correlation between the rate of unemployment and the rate of inflation (measured as annual nominal wage growth of employees) for a number of industrialised countries with more or less mixed economies. This correlation persuaded some analysts that it was impossible for governments simultaneously to target both arbitrarily low unemployment and price stability, and that, therefore, it was government's role to seek a trade-off between unemployment and inflation which matched a local social consensus.
Critics of this analysis argued that the Phillips curve could not be a fundamental of economic equilibrium because it showed a correlation between a real economic variable and a nominal economic variable. Their counter analysis was that government macroeconomic policy (primarily monetary policy) was being driven by an unemployment target and that this caused expectations to shift so that steadily accelerating inflation rather than reduced unemployment was the result. The resulting prescription was that government economic policy (or at least monetary policy) should not be influenced by any level of unemployment below a critical level - the NAIRU.
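That counter-analysis is usually written as the expectations-augmented Phillips curve (standard notation, added here for concreteness):

```latex
\pi_t = \pi_t^{e} - \alpha \left( u_t - u^{*} \right), \qquad \alpha > 0 ,
```

where $u^{*}$ is the NAIRU. With adaptive expectations, say $\pi_t^{e} = \pi_{t-1}$, holding unemployment below $u^{*}$ gives $\pi_t - \pi_{t-1} = \alpha (u^{*} - u_t) > 0$ in every period: inflation does not merely rise, it keeps accelerating, which is exactly the shift in expectations the critics described.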
The NAIRU theory was mainly intended as an argument against active Keynesian demand management and in favor of free markets. There is, for instance, no theoretical basis for predicting the NAIRU. Monetarists instead support the generalized assertion that the correct approach to unemployment is through microeconomic measures (to lower the NAIRU, whatever its exact level), rather than macroeconomic activism based on an estimate of the NAIRU in relation to the actual level of unemployment.
posted by iambrianfu [ 9:59 PM ]
ROBERT LUCAS AND RATIONAL EXPECTATIONS
An even bigger attack on Keynesianism came from Robert Lucas, the founder of a theory called rational expectations. (1) This highly mathematical theory dominated all economic thought in the 70s and early 80s, so much so that Lucas attracted a broad following of disciples who raised him to cult-leader status. By 1982, Lucas' views were so entrenched that Edward Prescott of Carnegie-Mellon University would boast that his students had never heard Keynes' name.
Lucas won the Nobel Prize in 1995 for the core aspect of his theory, that rational businessmen adjust their behavior to the government's announced economic policies. However, history has not been kind to the rest of his theory. Lucas himself has abandoned work on rational expectations, devoting himself nowadays to other problems, like economic growth. His once broad following has dissipated. And Lucas himself would admit upon receiving his Nobel prize: "The Keynesian orthodoxy hasn't been replaced by anything yet." (2)
There are two main parts to rational expectations. First, Lucas began with the old assumption that recessions are self-correcting. Once people start hoarding money, it may take several quarters before everyone notices that a recession is occurring. That's because people recognize their own hardships first, but it may take a while to realize that the same thing is happening to everyone else. Once they do recognize a general recession, however, their confusion clears, and the market quickly takes steps to recover. Producers will cut their prices to attract business, and workers will cut their wage demands to attract work. As prices deflate, the purchasing power of the dollar is strengthened, which has the same effect as increasing the money supply. Therefore, government should do nothing but wait the correction out.
Second, government intervention can only range from ineffectualness to harm. Suppose the Fed, looking at the leading economic indicators, learns that a recession has hit. But this information is also available to any businessman who reads a newspaper. Therefore, any government attempt to expand the money supply cannot happen before a businessman's decision to cut prices anyway. Keynesians are therefore robbed of the argument that perhaps the Fed might be useful in hastening a recovery, since Lucas showed that the Fed is not much faster than the market in discovering the problem.
Lucas then gave a fuller and more supported version of Milton Friedman's argument. Suppose the Fed established a predictable anti-recession policy: every time the unemployment rate climbs one percent, the Fed increases the money supply one percent. Rational businessmen would only come to expect these increases -- hence the term, rational expectations -- and would simply build automatic responses to monetary policy in their pricing systems. So in order to be effective, then, monetary policy would have to surprise businesses with random increases. But true randomness would make the economy less stable, not more so. The only logical conclusion is that the government's efforts to control the economy can be actually harmful.
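That chain of reasoning can be sketched in a few lines of code (a deliberately crude toy, not Lucas's actual model: it assumes output moves only with the money surprise, and that the Fed follows the simple proportional rule described above):

```python
# Toy policy-ineffectiveness experiment: output = money - expected money,
# i.e. only unanticipated policy has real effects (an assumption, per Lucas).
import random

random.seed(0)

def average_output_effect(expectation_rule, periods=1000):
    """Average real effect of the Fed's announced rule on agents who
    forecast the money supply according to expectation_rule."""
    total = 0.0
    for _ in range(periods):
        gap = random.uniform(0.0, 2.0)    # unemployment gap, visible to everyone
        money = 1.0 * gap                 # the Fed's predictable response
        expected = expectation_rule(gap)  # the private sector's forecast
        total += money - expected         # only the surprise moves output
    return total / periods

naive = average_output_effect(lambda gap: 0.0)           # ignores the announced rule
rational = average_output_effect(lambda gap: 1.0 * gap)  # builds the rule in

# Naive agents are systematically moved (roughly 1.0 here); rational agents
# anticipate the rule exactly, so the announced policy has no real effect.
print(naive, rational)
```

Only a random, unannounced component of money would surprise the rational agents here, which is the step that leads to Lucas's conclusion that systematic stabilization policy is impotent within this framework.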
Rational expectations borrowed heavily from earlier conservative theories, but Lucas supported these arguments mathematically, and in far greater detail and nuance. In fact, rational expectations spawned many new mathematical and statistical techniques, and allowed a generation of economists to specialize in these techniques. In a typical rational expectations model, the public adjusts its behavior to announced monetary policy. This is supposed to result in a more realistic description of the economy.
But despite its technical brilliance, today we know there are several major flaws in the theory.
First, it is not reasonable to believe that business owners generally determine their prices by following macroeconomic trends. Can you cite the Federal Reserve's rates and policies at the moment? The inflation and unemployment rates? The nation's GDP growth? Even more improbably, do you set your budget, prices and wage demands by these indicators? Only an economist (who knows all these statistics anyway) would think this is natural behavior.
Second, it is not reasonable to believe that humans are perfectly rational or perfectly informed. Much of Lucas' theory "worked" only after making such idealistic assumptions. Interestingly, Lucas and his followers have usually defended these assumptions by attacking their opposite. Early Keynesians had "overlooked" the fact that people would adjust their behavior to national economic policy. Here are some typical criticisms of this oversight by Lucas and others:
"The implicit presumption in these Keynesian models was that people could be fooled over and over again." -- Robert Lucas (3)
"The problem with the old models was that they assumed people were as dumb as dirt and could be fooled by the government into changing their behavior." -- Paul Romer (4)
"The essence of rational expectations could be summarized as 'people aren't as dense as policy makers used to think they were.'" -- Ron Ross (5)
The black and white universe of the Lucas school is rather amusing: if human beings are not walking calculators or walking statistics manuals, then, by God, Keynesians must think they are complete idiots. The truth is a bit more prosaic. Keynesians have primarily based their theories on historical evidence, not assumptions of citizen ignorance. Money has never deflated easily, for whatever reason, and Keynesian policies seem to work best in smoothing out the business cycle. The failure to deflate may spring from many sources: not just citizen ignorance of monetary policy, but certainly price rigidities as well. For example, during the severe recession of 1982, the Fed's proposal to expand the money supply was widely debated in the press. When the Fed did move, it was well announced. But rational and expectant businessmen did not raise their prices and create inflation. On the contrary, 1983 actually saw lower inflation, as well as a job-creation boom that would last seven years.
And this leads to the third flaw: Lucas's theory just doesn't explain reality very well. In an article entitled "Great Theory… As Far as it Goes," economist Michael Mandel writes: "Unfortunately, models built on rational expectations do not reflect the real world as well as the old Keynesian models they were supposed to replace… Most economic forecasting models still have a Keynesian core." (6)
To wit, the recessions of 80-82 and 90-92 behaved very differently from what Lucas's models predicted. Keep in mind their key assertion that recessions only happen because individuals are temporarily confused about the situation. Once workers realize they are in a general recession, they will cut their wage demands, which will restore full employment. But after the unemployment rate hit 10.7 percent in the winter of 1982, it took until 1987 for it to recover to 1979 levels (about 5-6 percent). Did it really take workers eight years to figure out they were in a recession before cutting their wage demands by the necessary amount? Of course not.
The main obstacle to Lucas's theory is that recessions last far too long to attribute them to temporary public confusion about the situation. Jimmy Carter was voted out of office for a "misery index" (inflation plus unemployment) that crested at 20 percent. Yet it wasn't until the second year of Reagan's term that a recovery started. The same with George Bush -- a recession struck in 1990, and his 90 percent approval rating took a free fall in the election campaign that followed. That campaign was defined by James Carville's slogan, "It's the economy, stupid." The economy did not truly start recovering until 1992, and employment took even longer to recover. If the public's awareness of recessions is great enough to drive presidents out of office after extended campaigns, it's clear that people understand their plight. But then why do recessions last so long?
Lucas and his followers searched everywhere for a model that would keep businessmen aware of leading economic indicators and yet ignorant of the fact that they were in a recession. Needless to say, they did not find one.
Life after Lucas
Around the mid-80s, economists became aware of the theory's shortcomings. One of these was probably Lucas himself, who never really bothered to defend it when journals began seriously questioning it. As Lucas moved on to other projects, conservative economists found themselves back at the drawing board, asking themselves such basic questions as: what is a recession?
The existence of recessions has always been an embarrassment to economists on the right. Recessions suggest that markets are not magical, that they often result in hardship and suffering. One far-right tack, as exemplified by the Austrian School of Economics, is to blame them on government. But depressions were exclusively a feature of laissez-faire economies; since the rise of modern welfare states, nations have seen greatly reduced recessions and unprecedented economic growth. So much for the government scapegoat.
The opposite tack, as exemplified by "real business cycle theory," has been to claim that recessions are actually beneficial self-cleansing rituals of the market. Thanks to recessions, the market adapts to change. This idea has been satirized in such Keynesian articles as Willem Buiter's "The Economics of Doctor Pangloss," after Voltaire's fictional philosopher who believes everything is for the best. As Paul Krugman remarked: "If recessions are a rational response to temporary setbacks in productivity, was the Great Depression really just an extended, voluntary holiday?" (7)
Real business cycle theory was presented as a replacement to rational expectations, but it ended up imploding under self-criticism and ideological infighting. Unlike monetarism or rational expectations, it never became a serious movement.
Instead, economics has seen the revival of Keynes, with the emergence of New Keynesianism. Many people thought that Lucas had refuted Keynesianism as a matter of principle. Any businessman with a newspaper would learn of a recession as quickly as the Fed, and would nullify the Fed's actions with the appropriate counteractions. However, economist George Akerlof would undermine this "refutation" with his own seminal article, "The Market for Lemons." Akerlof made two crucial observations, one of them obvious, one of them not-so-obvious.
The obvious one was that human beings are nearly rational, not perfectly rational. The not-so-obvious one is that nearly rational people may make decisions that approximate those of perfectly rational people, yet still obtain completely different economic results. Two examples best illustrate this point:
Suppose a farmer is selling wheat on the market, and notices that demand drops. Every other farmer is selling wheat for five cents a bushel less than he is. Now, wheat is a homogenous product, meaning that one farmer's wheat is virtually identical to another farmer's wheat. There is no advantage for a buyer to buy wheat at a higher price, so it would be foolish for the farmer to stay a nickel above the competition. The farmer may not want to lower his prices, but the market's supply and demand will force him to. So although the farmer is only nearly rational, he can come to perfectly rational decisions when the product is a homogenous one.
But what if the product is not homogenous? What if you're selling artwork? Artwork can range from a first-grader's finger-painting to "Ambroise Vollard" by Picasso. You might have a general idea of what it's worth, but to figure it to the penny, like wheat, is impossible. Too many variables affect the final price. One art collector might want the Picasso more than another collector, and be willing to pay more for it during an auction. A movie or book about Picasso might spur unusual interest in his paintings over those of Monet. A rich idiot might come along who has no idea what it's really worth. Therefore, an art seller is not being irrational by refusing to sell it to someone in the hopes of gaining a higher price. In other words, there is a range of acceptable prices, and most people hold out for higher prices out of self-interest.
The vast majority of goods on the market are not homogenous: examples include cars, homes, even workers on the labor market. People may be fully aware of a recession when they are in one, but they do not know how much they should reduce their prices. To know the exact percentage would require an astronomical amount of information and calculating ability. The cost of arriving at such a calculation would surely outweigh its benefits. Therefore, people should -- as a matter of principle -- try to make best guesses. Unfortunately, this often results in the sort of price stickiness that prevents recessions from curing themselves. Keynesian policies therefore remain a useful tool in cutting recessions short. We have long known that this is so in practice; it's heartening to know that this is so in theory now as well.
Next Section: Ronald Coase and the Coase Theorem
Return to Main Page
Endnotes:
1. Unless otherwise noted, this essay is primarily based on Paul Krugman, Peddling Prosperity (New York: W.W. Norton & Company, 1994), pp. 47-53, 197-205.
2. Steven Pearlstein, "Chicago Economist Wins Nobel Prize: Lucas has Focused on Theoretical Issues," San Jose Mercury News, Wednesday, October 11, 1995, p. 4A.
3. "Economics dynasty continues: Robert Lucas wins Nobel Prize," Chicago Journal, The University of Chicago Magazine, December, 1995.
4. Quoted in Steven Pearlstein, "Chicago Economist Wins Nobel Prize: Lucas has Focused on Theoretical Issues," San Jose Mercury News, Wednesday, October 11, 1995, p. 4A.
5. Ron Ross, "Anticipation," The North Coast Journal, December, 1995.
6. Michael Mandel, "Great Theory… As Far as it Goes," Business Week, October 23, 1995, p. 32.
7. Krugman, p. 204.
posted by iambrianfu [ 7:58 PM ]