
Thursday 26 March 2020

Any letter you like - except V


It has become clear in recent days just how much damage the sudden stop in economic activity triggered by the COVID-19 crisis is likely to cause. The services PMIs across Europe collapsed sharply in March, with the UK recording an all-time low on data back to 1996 which represents five standard deviations from the mean. The release of today’s US initial jobless claims data showed an extraordinary rise which was 33 standard deviations from the mean on weekly data back to 1967 (chart). This is an economic collapse the likes of which none of us has seen. It can be likened to the economic equivalent of hitting a wall at high speed: not only does the car get smashed up, but it will take time to recover from any injuries sustained.

It is for this reason that my doubts about a V-shaped recovery continue to mount. The BoE noted in the UK context that “given the severity of that disruption, there is a risk of longer-term damage to the economy, especially if there are business failures on a large scale or significant increases in unemployment.” Many companies have effectively been forced to cease business as a result of the lockdown implemented earlier this week, which will have major implications for cashflow, and a number of them may not resume trading when restrictions are finally lifted. To give some idea of how hard the collapse in spending will hit the economy, UK restaurant bookings slumped to zero over the second half of March based on data from Open Table. On average, restaurant traffic is 55% below its level in March 2019 and, with a weight of 9.5% in total spending, this alone will reduce consumer spending by around 5% relative to a year ago. Assuming restaurants do not reopen in April, the drag on annual spending will double. Notions of a 10% collapse in Q2 GDP suddenly do not look so fanciful.
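
As a rough sanity check on that arithmetic, the figures quoted above can be combined directly (a back-of-envelope sketch; the weight and traffic numbers are those cited in the text, nothing more):

```python
# Back-of-envelope check on the restaurant drag, using the figures quoted
# above (illustrative arithmetic only, not official data).
restaurant_weight = 0.095   # restaurants' weight in total consumer spending
traffic_drop_march = 0.55   # average fall in restaurant traffic vs March 2019

drag_march = restaurant_weight * traffic_drop_march
print(f"March drag on consumer spending: {drag_march:.1%}")   # ~5.2%

# If restaurants stay closed throughout April, traffic is 100% down for the
# full month, roughly doubling the year-on-year hit to spending.
drag_april = restaurant_weight * 1.00
print(f"April drag if closure persists: {drag_april:.1%}")    # ~9.5%
```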

A measure of the swing in expected 2020 economic growth against the 2019 outturn gives us an idea of where the COVID-19 effect is likely to hit hardest. Assuming Italian GDP falls by 5% this year following a small gain of 0.3% last year, this produces a total swing of 5.3 percentage points. By contrast, Chinese growth is only expected to slow from around 6% to 4%, producing a total swing of 2 percentage points. In the UK I currently expect something like a 4.5 percentage point swing. Naturally, forecasts at this stage are little more than guesswork so we should not read too much into the numbers, but given the severity of the crisis in Italy it is reasonable to assume it will take a major economic hit.
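
The swing measure itself is simple arithmetic, as a quick sketch using the illustrative forecasts above shows:

```python
# Swing in expected 2020 growth against the 2019 outturn, in percentage
# points, using the rough forecasts discussed above.
outturn_2019 = {"Italy": 0.3, "China": 6.0}    # % growth last year
forecast_2020 = {"Italy": -5.0, "China": 4.0}  # % growth expected this year

for country in outturn_2019:
    swing = outturn_2019[country] - forecast_2020[country]
    print(country, round(swing, 1))            # Italy 5.3, China 2.0
```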

Obviously we have no clear idea about the duration of the crisis but the standard assumption is that the bulk of any output contraction is concentrated in the first half of the year. That is itself a huge assumption, but even if it turns out to be true, many businesses will not survive the current hit despite the huge amount of support that the government is prepared to give. This will prove to be a searing economic experience for many of us, and that is without discussing the human costs associated with coronavirus. But it will not only be small firms that change their behaviour. Large firms will also be more circumspect given the impact that this year’s recession will have on earnings, which will act to hold back investment, and in any case they are likely to hold off on big spending plans until they are confident that demand is once again on an upward path.

It is, of course, too easy to extrapolate the bad news out into the future without taking account of the resilience shown by western economies. On the basis of what we know about the coronavirus now, there will be an economic recovery and probably sooner rather than later. But to give some idea of the impact of economic shocks, I looked at the three major recessions in the UK over the past 40 years and discovered that on average it takes almost four years for output to recover its pre-recession peak. An output collapse of 4% this year followed by trend growth of 1.5% a year would see output regain its pre-recession peak only in the fourth year, so the usual four-year cycle would indeed hold. Matters will be all the more difficult for many western economies given that potential growth today is far slower than prior to 2008 as a consequence of an ageing population and the sluggish nature of productivity growth over the past decade. Recall that in the wake of the Lehman collapse it took an awfully long time to be convinced that the economy had turned the corner. I suspect the same may also happen this time around.
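
That four-year figure can be checked with a couple of lines of arithmetic (a sketch using the stylised 4% collapse and 1.5% trend growth quoted above):

```python
# How long does output take to regain its pre-recession peak after a 4%
# collapse, assuming trend growth of 1.5% a year thereafter?
output = 1.0 * (1 - 0.04)   # output index after the collapse (peak = 1.0)
years = 1                   # the year of the collapse itself

while output < 1.0:
    output *= 1.015         # one year of trend growth
    years += 1

print(years, round(output, 4))   # 4 years, ~1.0039: peak regained in year four
```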

That said, the 2020 recession will be a catalyst to revisit many areas of the economy that have been ignored over the past decade and I will deal with them in more detail another time. But to throw out a few ideas at random, we are likely to find ourselves paying a lot more attention to the lessons of Keynes than we did a decade ago. Whatever else we take away from the 2020 experience, it is that there is a role for the state as an agent of last resort to step in when there is deficient demand. State capitalism is thus going to be higher up the agenda in many countries and it will be difficult for governments to introduce austerity programmes when this is all over. After all, we have had a decade of it in the UK and where has it got us? We will also have to have a public debate about how much state we want and how we plan to pay for it. I suspect the era of tax cutting may be over for a long time to come.

However, the future can make fools of us all so we should revisit some of this blue sky thinking in future to see whether it stacks up. Nonetheless I am struck by the fact that just as the post-1945 era was very different to the pre-1939 world, so we might look back at the unprecedented events of 2020 as the point at which the world economy changed.

Thursday 21 November 2019

Labouring under an illusion

The release of the Labour Party’s election manifesto today was a big deal. It has been described across much of the mainstream media as representing a swing to the left, proposing a significant increase in the role of the state including a big nationalisation programme and a major increase in government investment. It certainly represents a radical departure from the conventions of British politics over the last forty years by proposing fewer market solutions to economic management issues. As regular readers of this blog will know, I have long argued that the economic model of the past four decades, in which the market is predominant whilst the state plays a subordinate role, has run its course. But the plans presented by the Labour Party arguably represent a swing of the pendulum too far in the other direction.

Whatever one’s views, however, an impressive amount of work has been put into the economic plan. We can roughly divide it into two areas – redistribution and investment. Turning first to the redistributive element, the document outlining Labour’s funding plans is a serious piece of analysis, the likes of which I do not recall seeing in an election manifesto. It entails a significant medium-term increase in current outlays on areas such as education, health and social care, and work and pensions. Total current outlays by fiscal year 2023-24 are projected to be £83bn higher than measures announced in all previous fiscal events – a 20% increase over current plans (on the narrow definition of spending outlined in the document – it represents an increase of around 10% in total spending).

In fairness, Labour has gone to great lengths to explain how this increase will be funded. Around a quarter is to be generated from raising the corporate tax rate from its current level of 19% to 26% by FY 2023-24 (bringing in £23.7bn). The next largest chunk comes from raising capital gains tax and dividend taxation in line with income taxation (yielding £14bn), followed by £8.8bn from a financial transactions tax. Of the other headline-grabbing items, income taxes will be raised on those earning more than £80,000 per year (roughly three times the average wage), which is expected to yield about £5.4bn of revenue. It is notable, too, that the plans attempt to allow for changes in behaviour in response to higher taxes, so the figures quoted are the net expected yield rather than simply the gross yield. Like the underlying ethos or not, the funding plan is a carefully considered piece of economic analysis.
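
The headline shares can be reproduced from the document’s own numbers (a quick tally; the labels are my shorthand for the measures listed above, and the remainder comes from smaller items not itemised here):

```python
# Shares of the £83bn revenue plan accounted for by the headline measures
# quoted above (remainder from smaller measures not listed here).
total_bn = 83.0
measures = {
    "corporation tax to 26%": 23.7,
    "CGT and dividends at income tax rates": 14.0,
    "financial transactions tax": 8.8,
    "income tax above £80,000": 5.4,
}
for name, revenue_bn in measures.items():
    print(f"{name}: {revenue_bn / total_bn:.0%}")  # corporation tax ~29%, i.e. around a quarter
```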

The investment spending side of the plan confirmed the expected boost of £55bn per annum (2.5% of GDP). In effect, Labour intends to borrow only to fund investment with the redistributive element of the plan funded by higher taxes. Consequently, the simulation analysis I recently conducted on Labour’s plans still holds. Assuming that in the medium-term Labour injects £55bn per year into the economy, my analysis suggests this will raise the public deficit from 2.5% of GDP in the baseline to around 3.9% by FY 2023-24 (chart) and raise real GDP by 1.8 percentage points above the baseline. Incidentally, this implies a fiscal multiplier of around 0.75 (i.e. a fiscal boost of one percentage point of GDP increases output by 0.75%) which is not far out of line with estimates produced in an OECD paper in 2016 (and cited in Labour’s document).
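
The multiplier calculation is straightforward (a quick check using the simulation figures quoted above):

```python
# Implied fiscal multiplier from the simulation figures quoted above.
fiscal_boost = 2.5   # injection as % of GDP (£55bn per year)
gdp_gain = 1.8       # real GDP uplift vs baseline, percentage points

multiplier = gdp_gain / fiscal_boost
print(round(multiplier, 2))   # 0.72, i.e. roughly 0.75
```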

In order to assuage market concerns regarding its plans, Labour has proposed a fiscal credibility rule which will (i) eliminate the current budget deficit within five years; (ii) maintain interest payments below 10% of tax revenue and (iii) improve the strength of the government’s balance sheet during the next parliament. Part (i) is, in my view, achievable but (ii) will depend very much on how markets decide to set interest rates. Part (iii) is economically very interesting. Labour recognises that there are significant costs associated with its nationalisation plans but in buying up companies the state also acquires an asset. It thus proposes targets that take into account the net balance sheet position of such transactions. The idea is based on work by the Resolution Foundation (here) and it is a genuine fiscal innovation. There are indeed good reasons for incorporating it into the fiscal framework since the public sector balance sheet is increasingly a tool of macroeconomic management (an approach pioneered by central banks in recent years). 

But just because the plan is interesting and innovative does not mean it is sensible. It is a good old-fashioned soak-the-rich strategy, allied with a plan to tax financial institutions and the corporate sector. Paul Johnson of the well-respected Institute for Fiscal Studies said in a radio interview that the manifesto will produce “just about the most punitive corporate tax regime in the world”. It will crucify the City of London, where financial services generate 40% of the surplus on services trade, which in turn offsets a large part (though not all) of the deficit in goods trade. Simply put, the UK will be a far less attractive business location. A combination of this economic plan and Brexit would undo decades of work to improve the UK’s standing as a business-friendly location (although Labour does promise to put any Brexit deal it secures with the EU to a public vote and include an option to remain).

It appears from the latest opinion polls that Labour has little chance of getting sufficiently close to the levers of power to actually implement its plans. But it will move the dial. The electorate has had enough of the austerity forced on them over the last decade and the Conservatives will be forced to respond with a policy which also implies additional public spending. As even the FT’s economics correspondent Chris Giles pointed out in an article today, “taxes cannot be something that other people pay.” If the UK is serious about improving the quality of public services, notably the sacred cow which is the NHS, taxes will have to rise. But everyone has to make some contribution and it is dishonest to suppose that only big companies and the rich should pay the taxes from which everyone else benefits.

It is right that we have a proper debate about the role of the state in the economy. The benefits of a low tax, light-touch regulation regime worked for a long time but in the wake of 2008 the limits of this system were shown up. It’s just that Labour’s 1970s-style socialism is not the way to go either.

Wednesday 24 April 2019

A retrospective on macro modelling

Anyone interested in the recent history of economics, and how it has developed over the years, could do worse than take a look at the work of Beatrice Cherrier (here). One of the papers I particularly enjoyed was a review of how the Fed-MIT-Penn (FMP) model came into being over the period 1964-74, in which she and her co-author, Roger Backhouse, explained the process of constructing one of the first large scale macro models. It is fascinating to realise that whilst macroeconomic modelling is a relatively easy task these days, thanks to the revolution in computing, many of the solutions to the problems raised 50-odd years ago were truly revolutionary.

I must admit to a certain nostalgia when reading through the paper because I started my career working in the field of macro modelling and forecasting, and some of the people who broke new ground in the 1960s were still around when I was starting out in the 1980s. Moreover, the kinds of models we used were direct descendants of the Fed-MIT-Penn model. Although they have fallen out of favour in academic circles, structural models of this type are in my view still the best way of assessing whether the way we think the economy should operate is congruent with the data. They provide a richness of detail that is often lacking in the models used for policy forecasting today and in the words of Backhouse and Cherrier, such models were the “big science” projects of their day.

Robert Lucas and Thomas Sargent, both of whom went on to win the Nobel Prize for economics, began in the 1970s to chip away at the intellectual reputation of structural models based on Keynesian national income accounting identities for their “failure to derive behavioral relationships from any consistently posed dynamic optimization problems.” Such models, it was argued, contained no meaningful forward-looking expectations formation processes (true) which accounted for their dismal failure to forecast the economic events of the 1970s and 1980s. In short, structural macro models were a messy compromise between theory and data and the theoretical underpinnings of such models were insufficiently rigorous to be considered useful representations of how the economy worked.

Whilst there is some truth in this criticism, Backhouse and Cherrier remind us that prior to the 1970s “there was no linear relationship running from economic theory to empirical models of specific economies: theory and application developed together.” Keynesian economics was the dominant paradigm, and such theory as there was appeared to be an attempt to build yet more rigour around Keynes’ work of the 1930s rather than take us in any new direction. Moreover, given the complexity of the economy and the fairly rudimentary data available at the time, the models could only ever be simplified versions of reality.

Another of Lucas’s big criticisms of structural models was the application of judgement to override the model’s output via the use of constant adjustments (or add factors). Whilst I accept that overriding the model output offends the purists, the criticism presupposes that economic models will outperform human judgement. But such an economic model has not yet been constructed. Moreover, the use of add factors reflects a certain way of thinking about modelling the data. If we think of a model as representing a simplified version of reality, it will never capture all the variability inherent in the data (I will concede the point when all our estimated equations have an R-bar-squared close to unity). Therefore, the best we can hope for is that the error averages zero over history – it will never be zero at all times.

Imagine that we are in a situation where the last historical period in our dataset shows a residual for a particular equation which is a long way from zero. This raises the question of whether the projected residual in the first period of our forecast should be zero. There is, of course, no correct answer. It all boils down to the methodology employed by the forecaster – their judgement – and the trick to using add factors is to project them out into the future in a way that minimises the distortions to the model-generated forecast.
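
To make the mechanics concrete, here is a minimal sketch of one such convention (the helper name and the geometric-decay rule are my own illustration, not documented FMP practice): carry the last observed residual into the forecast and decay it towards zero, so the equation error fades out rather than being dropped abruptly.

```python
import numpy as np

def project_add_factors(last_residual: float, horizon: int,
                        decay: float = 0.5) -> np.ndarray:
    """Project an equation's add factor over the forecast horizon.

    The last observed historical residual is decayed geometrically towards
    zero, so the forecast neither ignores a large end-of-history residual
    nor assumes that it persists forever.
    """
    return last_residual * decay ** np.arange(1, horizon + 1)

# Example: the final historical residual on a consumption equation is +2.0
# (the model under-predicted consumption). Fade it out over eight quarters.
print(np.round(project_add_factors(last_residual=2.0, horizon=8), 3))
# [1.0  0.5  0.25 ...] -> the add factor fades towards zero
```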

But to quote Backhouse and Cherrier, “the practice Lucas condemned so harshly, became a major reason why businessmen and other clients would pay to access the forecasts provided by the FMP and other macroeconometric models … the hundreds of fudge factors added to large-scale models were precisely what clients were paying for when buying forecasts from these companies.” And just to rub it in, the economist Ray Fair later noted that analyses of the Wharton and Office of Business Economics (OBE) models showed that ex-ante forecasts from model builders (with fudge or add factors) were more accurate than the ex-post forecasts of the models (with actual data).

Looking back, many of the criticisms made by Lucas et al. seem unfair. Nonetheless, they had a huge impact on the way in which academic economists thought about the structure of the economy and how they went about modelling it. Many academic economists today complain about the tyranny of microfoundations, in which it is virtually impossible to get a paper published in a leading journal without linking models of the economy to them. In addition, the rational expectations hypothesis has come to dominate in the field of macro modelling, despite the fact that there is little evidence suggesting this is how expectations are in fact formed.

As macro modelling has developed over the years, it has raised more questions than answers. One of the more pervasive issues is that, like the models they superseded, modern DSGE models have struggled to explain bubbles and crashes. In addition, their treatment of inflation leaves a lot to be desired (the degree of price stickiness assumed in New Keynesian models is not evident in the real world). Moreover, many of the approaches to modelling adopted in recent years do not allow for a sufficiently flexible trade-off between data consistency and theoretical adequacy. Whilst recognising that there are considerable limitations associated with structural models using the approach pioneered by the FMP, I continue to endorse the view of Ray Fair, who wrote in 1994 that the use of structural models represents “the best way of trying to learn how the macroeconomy works.”

Wednesday 17 April 2019

Inflation beliefs

One of the biggest apparent puzzles in macroeconomic policy today is why inflation remains so low when the unemployment rate is at multi-decade lows. The evidence clearly suggests that the trade-off between inflation and unemployment is far weaker today than it used to be or, as the economics profession would have it, the Phillips curve is flatter than it once was (here). But as the academic economist Roger Farmer has pointed out, the puzzle arises “from the fact that [central banks] are looking at data through the lens of the New Keynesian (NK) model in which the connection between the unemployment rate and the inflation rate is driven by the Phillips curve.” But what if there were better ways to characterise the inflation generation process?

Originally, the simple interpretation of the Phillips curve suggested that policymakers could use this trade-off as a tool of demand management – targeting lower (higher) unemployment meant tolerating higher (lower) inflation. However, much of the literature that emerged in the late-1960s and early-1970s suggested that demand management policies were unable to affect unemployment in the long run and that it was thus not possible to control the economy in this way. The reason is straightforward – efforts by governments (or central banks) to stimulate the economy might lead in the short run to higher inflation, but repeated attempts to pull the same trick would prompt workers to push for higher wages, which in turn would choke off labour demand and raise the unemployment rate. In summary, government attempts to drive the unemployment rate lower would fail as workers’ inflation expectations adjusted. The absence of any such long-run trade-off implies that the Phillips curve is vertical in the long term (see chart).
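
A minimal simulation illustrates the mechanism (a textbook expectations-augmented Phillips curve with adaptive expectations; the parameter values are purely illustrative):

```python
# Textbook expectations-augmented Phillips curve with adaptive expectations:
#   pi_t = pi_e_t + alpha * (u_star - u_t),  with pi_e_t = pi_{t-1}.
# Holding unemployment below the natural rate makes inflation accelerate
# without limit, so there is no stable long-run trade-off: the long-run
# Phillips curve is vertical at u_star.
alpha, u_star = 0.5, 5.0   # slope and natural rate (illustrative values)
u_target = 4.0             # unemployment held one point below the natural rate
pi_e = 2.0                 # initial inflation expectations, %

for year in range(1, 6):
    pi = pi_e + alpha * (u_star - u_target)  # inflation overshoots expectations
    pi_e = pi                                # workers revise expectations upward
    print(year, round(pi, 2))                # 2.5, 3.0, 3.5, 4.0, 4.5
```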

Another standard assumption of NK models, which are heavily used by central banks, is that inflation expectations are formed by a rational expectations process. This implies some very strict assumptions about the information available to individuals and their ability to process it. For example, they are assumed to know in detail how the economy works, which in modelling terms means they know the correct structural form of the model and the value of all the parameters. Furthermore, they are assumed to know the distribution of shocks hitting the economy. Whilst this makes the models intellectually tractable, it does not accord with the way in which people think about the real world.

But some subtle differences to the standard model can result in significant changes to the outcomes, which we can illustrate with reference to some interesting recent work by Roger Farmer. In a standard NK model the crucial relationship is that inflation is a function of expectations and the output gap, which produces the expected result that the long-run Phillips curve is indeed vertical. But Farmer postulates a model in which the standard Phillips curve is replaced by a ‘belief’ function, in which expectations of nominal output next period depend only on its value in the current period (a martingale process). Without going through the full details (interested readers are referred to the paper), the structure of this model implies that policies which affect aggregate demand do indeed have permanent long-run effects on the output gap and the unemployment rate, in contrast to the standard NK model. Moreover, Farmer’s empirical analysis suggests that models using belief functions fit the data better than the standard model.
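
In stylised form, the difference between the two closures can be sketched as follows (my simplification of the two expectation rules, with hypothetical function names – not Farmer’s actual model):

```python
# Stylised contrast between the two closures (my simplification, not
# Farmer's full model). x denotes nominal GDP growth, in %.

def nk_expectation(steady_state: float) -> float:
    """NK closure: expectations are anchored on the model's steady state,
    so the economy is always pulled back towards it."""
    return steady_state

def belief_function(x_current: float) -> float:
    """Belief-function closure: expected nominal growth next period equals
    nominal growth this period (a martingale), so there is no fixed anchor."""
    return x_current

# After a demand shock drags nominal growth from 4% to 1%:
print(nk_expectation(steady_state=4.0))  # 4.0 -> the shock is expected to unwind
print(belief_function(x_current=1.0))    # 1.0 -> the shock is expected to persist
```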

The more we think about it, the more this structure makes sense. Indeed, as an expectations formation process, it is reasonable to assume that what happened in the recent past is a good indicator of what will happen in the near future (hence a ‘belief’ function). Moreover, since in this model the target of interest (nominal GDP) comprises real GDP and prices, consumers are initially unable to distinguish between real and nominal effects, even though any shocks which affect them may have very different causes. In an extreme case where inflation slows (accelerates) but is exactly offset by a pickup (slowdown) in real growth, consumers do not adjust their expectations at all. In the real world, where people are often unable to distinguish between real and price effects in the short term (money illusion), this appears intuitively reasonable.
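
A two-line example makes the point (illustrative numbers):

```python
# Only nominal growth matters for beliefs: a slowdown in inflation exactly
# offset by faster real growth leaves expectations unchanged.
real, inflation = 2.0, 2.0
nominal_before = real + inflation        # 4.0% nominal growth

real, inflation = 3.0, 1.0               # composition shifts...
nominal_after = real + inflation         # ...but nominal growth is still 4.0%

print(nominal_before == nominal_after)   # True -> beliefs do not adjust
```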

All this might seem rather arcane but the object of the exercise is to demonstrate that there is only a “puzzle” regarding unemployment and inflation if we accept the idea of a Phillips curve. One of the characteristics of the NK model is that it will converge to a steady state, no matter from where it starts. Thus lower unemployment will lead to a short-term pickup in wage inflation. Farmer’s model does not converge in this way – the final solution depends very much on the starting conditions. As Farmer put it, “beliefs select the equilibrium that prevails in the long-run” – it is not a predetermined economic condition. What this implies is that central bankers may be wasting their time waiting for the economy to generate a pickup in inflation. It will only happen if consumers believe it will – and for the moment at least, they show no signs of wanting to drive inflation higher.

Sunday 17 March 2019

MMT: Modern Monetary Theory or Mad Macro Tosh?

John Maynard Keynes’ General Theory of Employment, Interest and Money came about as a direct response to the economic conditions of the Great Depression which he attributed to deficient demand. His work, and that of his followers, demonstrated that cyclical variations could be dampened by greater government intervention to smooth out movements in the economic cycle. But by the late-1970s, the world was growing weary of recessions, high inflation and rising unemployment and was looking for alternative policy options to the statist economic model that had produced them. Thus was the free market economic revolution born. Fast forward thirty years to the fallout from the crash of 2008 and questions have increasingly been raised about the role of free market economics in producing the biggest economic slump in 80 years and the perception that economic and social inequality has widened.

One of the candidates for a new economic theory for the 21st century is Modern Monetary Theory (MMT), which has been around for a while but is now attracting a huge amount of attention in the US. As might be expected, the pendulum has swung completely, with the latest policy theory, which espouses a much bigger role for government, attempting to move from the fringes to the mainstream. It has increasingly been adopted by those on the left of the political spectrum as a policy which could justify a big increase in government spending. Alexandria Ocasio-Cortez, the high-profile US politician who is making a name for herself in the Democratic Party, has argued that her proposed expansionary fiscal agenda can be justified by MMT, in which a rising deficit does not impose major constraints on the US economy.

MMT’s big thing: The lack of a government budget constraint

So what exactly is MMT? It starts from the premise that the government is the sovereign supplier of money. Consequently, there is no such thing as a government budget constraint because governments can finance their deficits by creating additional liquidity at zero cost when the economy is running below full employment. Even when the economy is operating above full employment, although there are some inflationary consequences, the budget constraint is still not regarded as an issue.

In conventional economics, the intertemporal budget constraint implies that if a government has some existing debt, it must run surpluses in the future so that it can ultimately pay off that debt. In formal terms, current debt outstanding is equal to the discounted present value of future primary surpluses. MMT gets around this problem by arguing that since the government is the sovereign supplier of money, it will always be able to generate the liquidity to cover any debt obligations. This implies that a sovereign government that issues debt in its own currency can never go bust. There is nothing controversial in that proposition per se – indeed, it is one of the arguments I have long used to refute rating agencies’ concerns about UK fiscal solvency. However, there are some major reservations.
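
The constraint can be illustrated numerically (a sketch with illustrative numbers and a constant interest rate):

```python
# Numerical illustration of the intertemporal budget constraint: debt today
# equals the discounted present value of future primary surpluses
# (illustrative numbers, constant interest rate r).
r = 0.03        # interest / discount rate
debt = 100.0    # debt outstanding today

s = debt * r    # a perpetual primary surplus of debt * r services the debt
pv = sum(s / (1 + r) ** t for t in range(1, 2001))   # truncated perpetuity
print(round(pv, 1))   # ~100.0 -> the constraint holds
```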

MMT makes a big thing of the government’s monopoly on monetary creation. But, first, governments need not be the sole supplier of money, as the recent Bitcoin debate has illustrated – state money just happens to be the most convenient form. Second, all students of economic history are aware of past episodes in which unlimited monetary creation resulted in hyperinflation. This in turn gives economic agents an incentive to find alternative forms of money that will maintain their value. Third, whilst it is true that governments will always be able to repay their local currency debt, this does not justify expanding the deficit without limit. In what can be thought of as the “when you’re in a hole, stop digging” theory, governments cannot assume that there will always be willing buyers of their debt. If one government expands its deficit without limit but another is more prudent, bond investors will always favour the more prudent debt issuer.

It claims to offer insights that standard Keynesian analysis has missed

Indeed, the more closely we look at MMT, the more we realise that some of its key underpinnings do not stand up to scrutiny. One of the claims made by its proponents is that it offers new insights that standard Keynesian analysis has missed. This is an overstatement. MMT appears to claim that Keynesian analysis failed to recognise that governments could finance themselves by issuing money and that budget surpluses reduce private sector holdings of high-powered money (that which is issued by the monetary authority). Both of these claims are false, as both can be inferred from standard Keynesian IS-LM models. There is also nothing new in the claim that if the private sector wants to save more (less) than it invests, government must run a deficit (surplus). Again, this is standard national income identity stuff.

MMT claims to differ from standard Keynesian analysis in that it “does not rely on increasing aggregate demand in order to reach full employment; it disconnects full employment from economic growth[1].” It does so by engaging in “targeted spending that is designed to improve the structure of the labor market by developing a pool of employable labor while at the same time ensuring continuous employment of those ready and willing to work.” Critics such as Thomas Palley[2] point out that there are no theoretical underpinnings as to how this might work, which makes it hard to validate the intellectual argument. My reading of this approach is that it merely represents a choice by government as to how to use the resources at its disposal (which MMT proponents argue are unlimited).

But its treatment of inflation is hazy

What is new is the claim that it is possible to create higher employment without generating inflation. But MMT lacks a well-defined inflation process, which makes it difficult to validate this claim. Labour markets operate on the basis of the supply-demand principle, and if there are labour shortages in key areas there will be higher wage inflation, particularly where there are structural impediments such as a high concentration of trade union membership. Some MMT proponents do not accept that this Phillips curve-type world exists but offer no alternative inflation-generation model. There are some who do allow for such a process, but they cannot then escape the fact that they have introduced a trade-off between wage inflation and unemployment, even in today’s flatter Phillips curve world.

It seems to me that much of the analysis relies on the assumption that a wise government planner will be able to determine in advance where bottlenecks in the economy will arise and that offsetting action can be taken. But since I am not convinced that macroeconomists fully understand the inflation creation process (here) I rather suspect this may be a forlorn hope. 

Desperate times call for desperate measures by desperate governments 

I have long been an advocate of using fiscal policy as a tool of demand management, so an attempt to identify a coherent policy that ascribes a role for government should not be dismissed. I am just not convinced that MMT lives up to (m)any of the claims which are made for it. Palley argues that “it is a policy polemic for depressed times. A policy polemic that promises full employment and price stability at little cost will always garner some attention … such a policy polemic will be especially attractive in depressed times.” It is widely dismissed as being neither modern nor a theory. Then again, George H. W. Bush initially dismissed Ronald Reagan’s supply-side policies as “Voodoo Economics” before he signed up to them as Reagan’s running mate. Just because it is flawed may not prevent governments desperate for alternative policy measures from trying it out.




[1] Tymoigne, E. and L. R. Wray (2014), ‘Modern Money Theory 101: A Reply to Critics’, Levy Economics Institute Working Paper No. 778.
[2] Palley, T. (2014), ‘The critics of modern money theory (MMT) are right’, IMK Working Paper 132.