Wednesday, 30 November 2022

The DSGE paradigm: Do Stop Generating Errors

From RBC to DSGE

The recent passing of Ed Prescott, the 2004 Nobel Laureate in economics, was a cause for sadness across the economics profession. Prescott was universally recognised as a revolutionary thinker in the field of macroeconomics, and one of his great innovations (along with fellow Laureate Finn Kydland) was the introduction of so-called Real Business Cycle (RBC) models. In simple terms, these models postulate that business cycle fluctuations arise from households’ optimal labour supply decisions in response to stochastic productivity shocks. One consequence of this paradigm is that business cycles are optimal responses to such shocks, and that interventions to offset them are harmful because they cause the economy to deviate from its long-run optimal path.

An early and influential statement of these principles was Prescott’s 1986 paper ‘Theory Ahead of Business Cycle Measurement’, which relied on calibration rather than statistical estimation to match the data. It also described a world in which there were no distortions. Unsurprisingly, Keynesian economists did not take the RBC conclusions lying down. They argued that the economy is characterised by frictions such as nominal rigidities, monopoly power and information asymmetries, which can result in involuntary unemployment and thus open up a role for governments to smooth the cycle. In response, the so-called New Keynesians devised a modelling paradigm which required the imposition of a number of restrictive assumptions in order to approximate the world as they saw it.

Thus did the literature on New Keynesian Dynamic Stochastic General Equilibrium (DSGE) models come into being: Dynamic because they operate over very long (indeed infinite) horizons; Stochastic because they are driven by random shocks; and General Equilibrium because they are built up from microfoundations in which optimising agents interact and markets clear. Such models now dominate much of the academic thinking in the modelling and policy literature. But they are mathematically complex, opaque and founded on a series of assumptions that call into question whether they have anything useful to contribute to the future of macroeconomics[1].
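For readers who have not met one in the wild, the canonical New Keynesian DSGE model is usually boiled down, once log-linearised around its steady state, to a three-equation system. This is a textbook sketch rather than a description of any particular operational model:

x_t = \mathbb{E}_t x_{t+1} - \frac{1}{\sigma}\left(i_t - \mathbb{E}_t \pi_{t+1} - r^{n}_t\right) \quad \text{(dynamic IS curve, from the household Euler equation)}

\pi_t = \beta\, \mathbb{E}_t \pi_{t+1} + \kappa\, x_t \quad \text{(New Keynesian Phillips curve)}

i_t = \phi_\pi\, \pi_t + \phi_x\, x_t + \varepsilon_t \quad \text{(monetary policy rule)}

where x_t is the output gap, \pi_t inflation, i_t the policy rate, r^{n}_t the natural rate of interest and \varepsilon_t a stochastic shock. Much of what follows is a quarrel about how much of reality survives the journey into those three lines.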

What’s not to like? Quite a lot as it happens!

In the words of Olivier Blanchard, “there are many reasons to dislike current DSGE models”, particularly the apparently arbitrary nature of the assumptions on which they are based. For example, aggregate demand is derived from infinitely lived households which are assumed to have perfect foresight. Show me one of those and I will give you some hen’s teeth. Furthermore, the inflation equation is purely forward looking and takes no account of inflation persistence. But perhaps the most contestable feature of DSGE models is their slavish adherence to microfoundations. These attempt to embed patterns of economic behaviour that are invariant to the prevailing state of the world, including the policy regime. This allows macroeconomics to escape the charge posed by the Lucas critique – that the parameters of any model change as circumstances change – which was levelled at the models in operation in the 1970s and was perceived to be one of the reasons why they performed so badly in predicting the recessions of the time.
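The persistence point can be seen directly from the Phillips curve sketched above: inflation today depends only on expected future inflation and the output gap, with no role for its own past. To fit the data, the empirical literature typically has to bolt on a lagged term – a hybrid specification along the lines of (notation illustrative):

\pi_t = \gamma_b\, \pi_{t-1} + \gamma_f\, \mathbb{E}_t \pi_{t+1} + \kappa\, x_t, \qquad \gamma_b + \gamma_f \approx 1

with estimates of \gamma_b generally found to be significantly positive, which is hard to square with the purely forward-looking baseline.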

There is a lot wrong with this way of thinking. For one thing, the microfoundations are based on the behaviour of representative agents. In other words, they impose a theory of how individual firms and households act and assume that this can be scaled up to the wider economy. As one who grew up using models based on aggregate data, I have always found it odd that we should discard so much of the richness inherent in the observational evidence of macro data. An interesting theoretical paper published in 2020 makes the more subtle point that for a representative agent to mimic the preference structure of the population requires extreme restrictions on the utility function used to describe household behaviour. The supreme irony is that the DSGE revolution was able to capture the intellectual high ground because the structural modelling paradigm it replaced was unable to counter the criticism, levelled in Chris Sims’s classic 1980 paper, that such models relied on “incredible” identifying assumptions.
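Without reproducing the 2020 paper’s result, the classic Gorman aggregation condition gives a flavour of just how strong the restrictions have to be: exact aggregation to a representative consumer requires every household’s indirect utility to take the Gorman polar form

v_i(p, m_i) = a_i(p) + b(p)\, m_i,

with the same b(p) for all households, so that Engel curves are linear and parallel and the distribution of income becomes irrelevant to aggregate demand – which is a strong thing to assume about any actual economy.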

A further thought is that we have little evidence that the utility functions of representative agents are invariant over time, as modern macro theory assumes. Whilst there is no doubt that the research underpinning the macro revolution of the late-1970s and early-1980s – including the influential work of Lucas – is intellectually persuasive, the evidence of this year alone, in which inflation has spiked to 40-year highs that most models failed to foresee in 2021, does not persuade me that the DSGE revolution has significantly enhanced the thinking in modern macro.

By this point you have probably gathered that I am highly sceptical of much of the work conducted in modern macro modelling in the last 40 or so years. This is not to deny that it is intellectually fascinating and I am more than happy to play around with DSGE models. But as Anton Korinek points out in this fascinating essay, “DSGE models aim to quantitatively describe the macroeconomy in an engineering-like fashion.” They fall victim to the “mathiness” in economics, of which Paul Romer was so scathing.

And their forecasting performance is poor

We might be more accepting of the DSGE paradigm if it produced significantly better forecasting results than what went before. The events of the past 15 years suggest that this is far from the case, and it is now generally acknowledged that the out-of-sample forecasting performance of DSGE models is very poor. If this does not render them useless as a forecasting tool, it does suggest that they are no better than the structural models which the academic community has spent forty years trying to knock down. Proponents will argue that forecasting is not what they are designed to do. Rather, they are designed to uncover the deep-seated parameters underpinning household and corporate decision-making, which in turn allows for policy evaluation.
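For what it is worth, the standard way of testing the out-of-sample claim is a rolling forecast comparison against a naive benchmark. The sketch below is purely illustrative – dsge_forecast is a hypothetical stand-in for whatever model is under evaluation, not a real library call – but it captures the mechanics of the exercise:

import numpy as np

def rolling_rmse(series, forecaster, window=80, horizon=4):
    # Rolling out-of-sample RMSE at a given horizon.
    # forecaster(history, horizon) returns a forecast of the value
    # `horizon` periods beyond the end of `history`.
    errors = []
    for t in range(window, len(series) - horizon):
        forecast = forecaster(series[:t], horizon)
        errors.append(series[t + horizon - 1] - forecast)
    return float(np.sqrt(np.mean(np.square(errors))))

def random_walk(history, horizon):
    # Naive 'no change' benchmark: the last observed value.
    return history[-1]

def dsge_forecast(history, horizon):
    # Hypothetical stand-in for the model under test. In practice this
    # would re-estimate the model on `history` and return its
    # h-step-ahead prediction; here it is just placeholder dynamics.
    return history[-1] + horizon * np.mean(np.diff(history[-8:]))

rng = np.random.default_rng(0)
inflation = np.cumsum(rng.normal(0.0, 0.3, 200))  # synthetic series for illustration
print("Model RMSE:", rolling_rmse(inflation, dsge_forecast))
print("Naive RMSE:", rolling_rmse(inflation, random_walk))

The point is simply that if a model cannot systematically beat the naive benchmark at the horizons that matter, claims about its superiority need to rest on something other than forecasting.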

This debate was brought into sharp focus recently by the publication of a fascinating paper on the properties of DSGE models, which is less concerned with whether they represent good economics than with whether they represent good models in a statistical sense. The answer, according to the authors, is that they do not. The paper gets quite dense in places, but one of the things it does is examine how well the model can fit nonsense data. By randomly swapping the series around and feeding them into the DSGE model, “much of the time we get a model which predicts the [nonsense] data better than the model predicts the [actual] data.” They draw the damning conclusion that “even if one disdains forecasting as an end in itself, it is hard to see how this is at all compatible with a model capturing something – anything – essential about the structure of the economy.”
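The flavour of that exercise can be conveyed in a few lines – a stylised illustration of the idea rather than the authors’ actual procedure, and model_fit is a hypothetical stand-in for whatever likelihood or prediction criterion one prefers:

import numpy as np

def shuffle_test(data, model_fit, n_trials=100, seed=1):
    # data is a (T, k) array of k macro series; model_fit(data) is a
    # hypothetical stand-in returning a goodness-of-fit score for the
    # model on that dataset (higher = better fit).
    rng = np.random.default_rng(seed)
    baseline = model_fit(data)
    beats_real = 0
    for _ in range(n_trials):
        # 'Nonsense' data: randomly reassign the observed series to the
        # wrong variables, e.g. feed consumption in where inflation belongs.
        permuted = data[:, rng.permutation(data.shape[1])]
        if model_fit(permuted) > baseline:
            beats_real += 1
    # Share of nonsense datasets the model fits better than the real one.
    return beats_real / n_trials

A well-specified structural model ought to fit the genuine data decisively better than mislabelled versions of it; the paper’s finding is that this is frequently not the case.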

Last word

As one who has for many years used forecasting models that are sniffily dismissed by the academic community, I find it hard to avoid a sense of schadenfreude. Blanchard offers us a way out of this impasse, arguing that theoretical models of the economy have a role to play in “clarifying theoretical issues within a general equilibrium setting ... In short, [they] should facilitate the debate among macro theorists.” By contrast, policy models of the type with which I am most comfortable “should fit the main characteristics of the data” and be used for forecasting and policy analysis.

There is some merit in this argument. By all means continue to tinker with DSGE models to see what kinds of insight they can generate but do not let them anywhere near the real world until their forecast performance substantially improves. In the words of statistician George Box, “all models are wrong but some are useful”. And some are DSGE models.


[1] For an excellent introduction to many of the issues in modern macro, check out this free online textbook ‘Advanced Macroeconomics: An Easy Guide’ by Filipe Campante, Federico Sturzenegger and Andrés Velasco.

Sunday, 20 November 2022

Think outside the fiscal box

Fiscal issues continue to dominate the UK economic agenda with last week’s budget prompting huge debate as it attempted to repair some of the damage done by the Truss-Kwarteng car-crash mini budget in September. Not all of the deterioration in the fiscal position can be attributed to Truss’s disastrously brief tenure. Although the latest plan suggests that the government will borrow an additional £306bn by fiscal year 2026-27 compared to the March budget, a large portion of this reflects the measures designed to shield households from the full impact of the rise in energy bills (£37.6bn over the next two years). In this sense, part of the deterioration reflects an attempt to do the right thing by taxpayers. 

Higher debt interest payments account for roughly two-thirds of the increase in borrowing. This is only partly related to the market reaction to the disastrous mini budget. It is much more the result of higher interest rates in response to inflation at four-decade highs – a reflection both of domestic monetary policy, as the Bank of England raises rates, and of global conditions, as central banks around the world scramble to tighten. Nonetheless, it is incongruous that the UK government has implemented a tighter fiscal stance partly in response to tighter monetary conditions. Any fears that fiscal dominance is a theme in the UK policy environment can be set aside for now.

The government did attempt to offset some of the fiscal damage resulting from these two factors by introducing a series of measures to close the gap. Whilst borrowing is higher than projected nine months ago, it is lower than it would have been had the government not also implemented measures that by fiscal 2027-28 imply a fiscal tightening of £54.9bn – a fiscal contraction equivalent to roughly 0.5% of GDP in each of the next five years. Spending cuts account for £30bn of the measures, although these are not scheduled to kick in until after the next election, whilst the remainder comes from higher tax revenues (chart).
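As a rough sanity check on those figures – assuming nominal GDP of around £2.5 trillion, which is my round number rather than one taken from the statement:

\frac{54.9}{2500} \approx 2.2\% \text{ of GDP} \;\Rightarrow\; \text{roughly } 0.4\text{–}0.5\% \text{ of GDP a year over five years}, \qquad \underbrace{30.0}_{\text{spending}} + \underbrace{24.9}_{\text{tax}} = 54.9 \text{ (£bn)}

which is consistent with the headline numbers quoted above.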

There is an important political dimension to the latest budget plans. The fact that the spending cuts only take effect in two years’ time means that the government is setting an elephant trap for its successor. If Labour win the election, which must be held no later than January 2025 and which seems likely on the basis of recent polling evidence, they will have to decide whether to go along with the current government’s fiscal plans or take a risk with new plans of their own, having been a major critic of the austerity policies in place since 2010. If the Tories win … well, that’s a bridge they will cross when they come to it.

What is the goal of fiscal policy?

All of this leads to the real question: what exactly is the goal of fiscal policy? In recent weeks the debate has been all about the ideological split within the Conservative Party, with supporters of a big state pitted against those advocating a small state, and there is a sense that the latter group is very much on the back foot. Last week’s efforts were clearly aimed at getting the markets back onside following the September shenanigans. Whilst it is important not to alienate one’s creditors, there was a nagging sense that too much emphasis was placed on placating markets. The budget balance cannot simply be an end in itself.

This raises the question of how much fiscal space the UK has and how it stacks up against comparable economies. Conventionally, fiscal space is determined by the level of debt which markets believe an economy can carry without jeopardising its overall economic and fiscal position. It is true that UK gross debt levels are high – 97% of GDP at the end of fiscal 2021-22, rising to a peak of around 107% by early 2024 on the basis of the OBR’s forecast. Looking across Europe, however, the likes of Spain, France and Italy all have higher debt ratios (118%, 113% and 150% respectively), and none of them issues debt in a currency it controls. There is not the same debate about the need to wear the fiscal hairshirt in these countries, despite their worse fiscal positions. Fear of the markets, rather than a coherent attempt to use policy in a more strategic sense, is currently the dominant theme of British fiscal policy.
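For reference, the textbook debt-dynamics identity that underpins any assessment of fiscal space relates the change in the debt-to-GDP ratio b to the effective nominal interest rate r, nominal growth g and the primary balance pb (a standard expression, not an OBR formula):

\Delta b_t = \frac{r_t - g_t}{1 + g_t}\, b_{t-1} - pb_t

The level of debt on its own says relatively little; what matters is whether growth keeps pace with the interest bill and what primary balance is required to stabilise the ratio.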

What should policy be doing?

Successive governments have failed to have a grown-up debate about appropriate levels of tax. The demographic dividend that allowed the Thatcherite Tory party to cut taxes has ceased to be a tailwind and is now beginning to act as a fiscal headwind. Whilst it has long been said that the British electorate wants Scandinavian-level public services whilst paying US-level taxes, it is currently getting neither (almost the opposite, in fact). Boris Johnson’s levelling-up agenda has been consigned to the dustbin, and redistribution is still a dirty word in many areas of government.

Whilst Truss’s highly regressive fiscal plans have been found wanting, we are not yet at the stage of talking about the different things we might want fiscal policy to do. If we are serious about a transition to a green economy, for example, a big public investment in this area might pay dividends. I have also long advocated investment in making the economy fit to cope with climate change – again, government can play a role. However, we have missed our chance to use the lowest interest rates in history to fund such investment. But even if the government does not want to do any of these things, taxes will have to rise in order to provide the levels of public services that people have come to expect. To raise that revenue, we need to start thinking outside the box on tax.

Governments over the years have tinkered with income and corporate taxes as the main areas of focus. I have in the past suggested that wealth taxes are an area that should at least be looked at more closely. Tax Justice UK, a lobby group which advocates for a “sustainable, fair and effective tax system”, has suggested a number of tax loopholes which could be closed to raise substantial revenue. Amongst the measures it proposes is equalising capital gains tax rates with income tax rates, raising up to £14bn a year (a measure originally championed by the Office of Tax Simplification). Extending national insurance to investment income could raise a further £8.6bn a year. In all, it has identified measures which could generate an additional £37bn a year in revenue.
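For the record, the two measures singled out here account for roughly 60% of that total, with the balance coming from the other loophole closures the group identifies (my arithmetic from the figures quoted):

14.0 + 8.6 = 22.6 \text{ (£bn)}, \qquad 37.0 - 22.6 = 14.4 \text{ (£bn from the remaining measures)}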

The bottom line is that there is no obvious need to squeeze spending quite as hard as the government outlined last week. Austerity was a bad policy in 2010. It is an even worse response today, with public services cut to the bone in many areas. Whilst many argue that this is all to do with Brexit, it is not. We are at a point where demographics have collided with a low-productivity economy to leave the government scrambling for a fix. Brexit is not helping, of course, since it will depress growth in the longer term and widen the budget gap. But in order to tackle the problems, we need to get away from the sterile discussion about further raising income or corporate taxes. What is needed is a fundamental reform of the tax system.