Tuesday, 30 April 2019

The US: As good as it gets?


For all the recent concerns about a global economic slowdown, much of the data released in the last couple of weeks supports the view that the world economy is, for the most part, enjoying a decent run. Despite the government shutdown around the turn of the year and concerns over the ongoing trade dispute with China, the US posted an annualised GDP growth rate of 3.2% in Q1. Whilst inventories contributed almost 0.7 percentage points to this figure, the GDP outcome was still significantly better than expected at the start of the year. As Mohamed El-Erian pointed out in a tweet, “the 2s-10s US yield curve has steepened quite a bit in the last two and a half weeks” – a move that has received far less attention than the preceding flattening, which was viewed in many quarters as a harbinger of recession (chart).
Just to show that the good news is not confined to the US, the Q1 euro zone GDP data out this morning pointed to a rise of 0.4% q-o-q (around 1.6% on an annualised basis). Whilst this was considerably slower than the US pace, it was again rather stronger than might have been anticipated a few weeks ago. Next week’s Q1 UK figures are also likely to come in at 0.4% q-o-q, with the possibility of an upside surprise on the back of pre-Brexit inventory accumulation.

With the US economy continuing to look solid and no sign that the cycle is about to turn down – indeed, it is set to become the longest cyclical upswing on record – equity markets continue to power ahead. That said, Q1 earnings have been rather disappointing on the whole: earnings per share on the S&P500 are currently running around 6.5% below Q4 levels and 9.5% below the Q3 2018 peak. Based on consensus estimates, it thus looks likely that, barring a miraculous surge, US earnings in 2019 may register only a small gain of around 2.5% following a hefty 23.7% last year (the biggest gain since 2010). Naturally, this reflects the fact that the 2018 numbers were flattered by corporate tax cuts, and some weakness was always likely given that last year’s one-off boost could not be repeated.

Doubtless you will read newspaper headlines from equity bulls pointing out that around 80% of S&P500 companies have beaten earnings expectations of late. But we should not place much weight on a metric which companies routinely game in order to flatter their earnings profile. The simple truth is that, despite the strength of the economy, corporate USA is unlikely to repeat last year’s stellar numbers. That is not necessarily a bad thing: If the market consensus proves to be right, US earnings growth over 2018-19 will still cumulate to 27%, which translates into 12.6% per annum versus an annual average of 7.4% since the turn of the millennium.
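For those who like to check the compounding, the arithmetic behind those figures (using the consensus numbers quoted above) is simply:

\[
(1 + 0.237)\times(1 + 0.025) \approx 1.268 \quad\Rightarrow\quad 1.268^{1/2} - 1 \approx 12.6\% \text{ per annum}
\]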

But there are some indications that the dynamic that has driven the US market over the past couple of years may be fraying at the edges. First quarter earnings at Alphabet (Google’s parent company) undershot expectations thanks to weak revenues, whilst Netflix’s forecasts for subscriber numbers also trailed estimates in Q1. Admittedly Netflix has aggressive targets for the remainder of the year and a track record of delivering, and as a result the price has held up pretty well since the start of the year. But such has been the strength of the FAANG (Facebook, Apple, Amazon, Netflix and Google) sector that any downside surprises may catch the market unawares.

The Fed, which meets tomorrow, is likely to keep rates firmly on hold for a long time to come and to reiterate that it is in wait-and-see mode. Whilst this has helped drive markets higher, I can’t help wondering whether the decision in March to announce a pause was a tad premature, particularly given the strength of activity in Q1. Call me old-fashioned, but it is not the job of central banks to support the markets, which after all have had a great run over the past decade. And given the extent to which equity markets and economic fundamentals have been running out of line, a further modest monetary tightening – or at least the impression that the Fed might deliver one – may be enough to take some of the froth out of the markets without unduly derailing the economy.

Nonetheless, the macro data are probably as good as we are going to get – this is the ultimate Goldilocks scenario, with the US economy neither too hot nor too cold. The labour market data suggest that the job machine is running smoothly and consumer confidence has recently spiked to near all-time highs. Donald Trump’s recent exhortations to the Fed to cut interest rates and engage in more QE are thus completely the wrong advice right now. But if past experience is any guide, we should enjoy the current conjuncture while we can. It may not last.

Monday, 29 April 2019

The market for central bank governors

The search for a successor to BoE Governor Carney kicked off last week, ahead of his contract expiry next January, whilst jockeying for the top job at the ECB has also got underway, with Mario Draghi due to stand down at the end of October. Naturally the press has had a field day looking at the possible candidates for both positions. But less attention has been paid to the qualities necessary to be an effective central bank governor.

Over the last 30 years there has been a tendency to appoint economists to the top job. This has not always been the case, of course: whilst former ECB President Trichet and BoE Governor Eddie George both had academic qualifications in the subject, neither would be regarded as a front-line economist. But comparing them with contemporaries such as Alan Greenspan, Ben Bernanke, Janet Yellen, Wim Duisenberg, Mervyn King and Draghi, it is clear that a strong economics background has come to be viewed as an advantage. The reason for this is simple enough: Over recent years, central banks have been given a mandate to target inflation, which means that they have a much closer focus on economic issues than has historically been the case.

However, I do wonder whether the unwillingness to raise interest rates – particularly in Europe – reflects the overly cautious nature of a policy-making body in which economists hold the upper hand. It was not for nothing that President Harry Truman reputedly demanded a one-handed economist in order to eliminate the profession’s tendency to say “on the one hand … but on the other.” More seriously, since the financial crisis central banks have acquired additional responsibility for managing the stability of the financial system, which means that a macroeconomic background may not be the advantage it once was.

Perhaps the most important job of any CEO, whether of a central bank or a listed company, is institution building. Mark Carney promised to be the new broom at the BoE who would bring the bank into the 21st century, getting rid of many of the arcane practices which had become institutionalised over the years and improving diversity. I am not qualified to say whether he has succeeded in this goal, but we hear good things about the working environment within the central bank. More importantly, perhaps, the BoE has taken on the regulation and supervision of around 1,500 financial institutions over the past six years as its responsibilities have evolved, and the head of the Prudential Regulation Authority now occupies one of the most senior jobs in the BoE.

One of those touted to succeed Carney is Andrew Bailey, head of the Financial Conduct Authority, an institution independent of the BoE which is charged with ensuring that “financial markets work well so that consumers get a fair deal.” Bailey is a former BoE official who has worked in an economics function, but crucially has a very strong background in regulation. It is an indication of the extent to which the BoE’s role has changed in recent years that Bailey is even in the running for the job.

The experience of the ECB presidency has been rather different since Mario Draghi took over in 2011. He is – probably rightly – credited with holding the European single currency bloc together during the euro zone debt crisis by promising to do “whatever it takes,” despite opposition from representatives of other member states, notably Germany. Like the BoE, the ECB has also taken on greater responsibility for the regulation of financial institutions, although unlike the BoE there is no suggestion that Draghi’s potential successor will need a background in financial regulation.

Interestingly, this paper by Prachi Mishra and Ariell Reshef makes the point that the personal characteristics and experience of central bank governors do affect financial regulation. “In particular, experience in the financial sector is associated with greater financial deregulation [whilst] experience in the United Nations and in the Bank of International Settlements is associated with less deregulation.” They go on to argue that their analysis “strengthen[s] the importance of considering the background and past work experience before appointing a governor.”

This is an important point. In 2012, when the BoE was looking for a successor to Mervyn King, the Chancellor of the Exchequer cast his net far and wide. Mark Carney got the gig because the government wanted an outsider to take over a central bank which was perceived to be too close to the institutions it was meant to regulate. Moreover, he had previous experience of running a central bank. But whilst Carney has done a good job over the past six years, I still believe it is wrong to think (as Chancellor George Osborne did during the hiring process) that filling this role is akin to finding the CEO of a multinational company, whose place can be filled by anyone from an (allegedly) small pool of international talent. A central bank governor is an unelected official who holds a position of key strategic importance, enjoying unprecedented powers to influence both monetary policy and the shape of the banking system. In that sense it has never been clear to me that the interests of an outsider with no experience of UK policy issues are necessarily aligned with the UK's national interest.

Contrast this with the way the ECB process works. There are, in theory, 19 candidates for the top job amongst the central bank governors of EMU members, all of whose interests are aligned with those of the euro zone. In addition, there are another five potential candidates amongst the members of the Executive Board. Admittedly there is a lot of political horse-trading involved in the selection process, but there is no need to look for an outsider who may not be up to speed with the complexities of local issues, not to mention local politics, which is increasingly a problem for central bankers (I will come back to this another time).

For the record, this is absolutely not an issue of economic nationalism – it is simply a reminder to those making hiring decisions that the fact that someone has done a similar job elsewhere does not necessarily make them the best candidate for the position in question. Indeed, if the evidence from the corporate world is anything to go by, the continuity candidate may well be the best person for the job: In the private sector, “firms relying on internal CEOs have on average higher profits than external-CEO firms”. And for anyone who doubts that the search for an external candidate will necessarily be an improvement over the local options, just ask the English Football Association about their experiences with Sven-Göran Eriksson and Fabio Capello.

Wednesday, 24 April 2019

A retrospective on macro modelling

Anyone interested in the recent history of economics, and how it has developed over the years, could do worse than take a look at the work of Beatrice Cherrier (here). One of the papers I particularly enjoyed was a review of how the Fed-MIT-Penn (FMP) model came into being over the period 1964-74, in which she and her co-author, Roger Backhouse, explained the process of constructing one of the first large-scale macro models. It is fascinating to realise that whilst macroeconomic modelling is a relatively easy task these days, thanks to the revolution in computing, many of the solutions to the problems encountered 50-odd years ago were genuinely ground-breaking.

I must admit to a certain nostalgia when reading through the paper because I started my career working in the field of macro modelling and forecasting, and some of the people who broke new ground in the 1960s were still around when I was starting out in the 1980s. Moreover, the kinds of models we used were direct descendants of the Fed-MIT-Penn model. Although they have fallen out of favour in academic circles, structural models of this type are in my view still the best way of assessing whether the way we think the economy should operate is congruent with the data. They provide a richness of detail that is often lacking in the models used for policy forecasting today and in the words of Backhouse and Cherrier, such models were the “big science” projects of their day.

Robert Lucas and Thomas Sargent, both of whom went on to win the Nobel Prize for economics, began in the 1970s to chip away at the intellectual reputation of structural models based on Keynesian national income accounting identities for their “failure to derive behavioral relationships from any consistently posed dynamic optimization problems.” Such models, it was argued, contained no meaningful forward-looking expectations formation processes (true) which accounted for their dismal failure to forecast the economic events of the 1970s and 1980s. In short, structural macro models were a messy compromise between theory and data and the theoretical underpinnings of such models were insufficiently rigorous to be considered useful representations of how the economy worked.

Whilst there is some truth in this criticism, Backhouse and Cherrier remind us that prior to the 1970s “there was no linear relationship running from economic theory to empirical models of specific economies: theory and application developed together.” Keynesian economics was the dominant paradigm, and such theory as there was appeared to be an attempt to build yet more rigour around Keynes’ work of the 1930s rather than take us in any new direction. Moreover, given the complexity of the economy and the fairly rudimentary data available at the time, the models could only ever be simplified versions of reality.

Another of Lucas’s big criticisms of structural models was the application of judgement to override the model’s output via the use of constant adjustments (or add factors). Whilst I accept that overriding the model output offends the purists, their objection presupposes that economic models will outperform human judgement – and no such model has yet been constructed. Moreover, the use of add factors reflects a particular way of thinking about modelling the data. If we think of a model as a simplified version of reality, it will never capture all the variability inherent in the data (I will concede the point when we can estimate equations all of which have an R-bar squared close to unity). Therefore, the best we can hope for is that the error averages zero over history – it will never be zero at all times.

Imagine that we are in a situation where the last historical period in our dataset shows a residual for a particular equation which is a long way from zero. This raises the question of whether the projected residual in the first period of our forecast should be zero. There is, of course, no correct answer. It all boils down to the methodology employed by the forecaster – their judgement – and the trick to using add factors is to project them out into the future in a way that minimises the distortions to the model-generated forecast.
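To make the mechanics concrete, here is a minimal sketch of one possible convention – carrying the last observed residual forward and letting it decay towards zero. To be clear, the decay rule and the numbers are my own illustrative assumptions, not a description of how the FMP model (or any particular forecaster) handled its add factors.

```python
import numpy as np

def project_add_factors(residuals, horizon, decay=0.5):
    """Project add factors over the forecast horizon.

    residuals : historical equation residuals (actual minus fitted)
    horizon   : number of forecast periods
    decay     : fraction of the adjustment retained each period
                (decay=1 carries the last residual forward unchanged;
                 decay=0 sets all future add factors to zero)
    """
    last = residuals[-1]  # the most recent historical residual
    return np.array([last * decay**t for t in range(1, horizon + 1)])

# Example: the final historical residual is a long way from zero
hist_residuals = np.array([0.10, -0.20, 0.05, 0.80])
add_factors = project_add_factors(hist_residuals, horizon=4, decay=0.5)
# add_factors -> [0.4, 0.2, 0.1, 0.05]

# The published forecast is the model-generated path plus the judgemental adjustment
model_forecast = np.array([2.0, 2.1, 2.2, 2.3])
adjusted_forecast = model_forecast + add_factors
```

Different forecasters will choose a different decay path, or none at all; the point is simply that the choice is judgemental rather than mechanical.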

But to quote Backhouse and Cherrier, “the practice Lucas condemned so harshly, became a major reason why businessmen and other clients would pay to access the forecasts provided by the FMP and other macroeconometric models … the hundreds of fudge factors added to large-scale models were precisely what clients were paying for when buying forecasts from these companies.” And just to rub it in, the economist Ray Fair “later noted that analyses of the Wharton and Office of Business Economics (OBE) models showed that ex-ante forecasts from model builders (with fudge or add factors) were more accurate than the ex-post forecasts of the models (with actual data).”

Looking back, many of the criticisms made by Lucas et al. seem unfair. Nonetheless, they had a huge impact on the way in which academic economists thought about the structure of the economy and how they went about modelling it. Many academic economists today complain about the tyranny of microfoundations: it is virtually impossible to get a paper published in a leading journal unless models of the economy are built upon them. In addition, the rational expectations hypothesis has come to dominate the field of macro modelling, despite the fact that there is little evidence to suggest this is how expectations are actually formed.

As macro modelling has developed over the years, it has raised more questions than it has answered. One of the more persistent criticisms is that, like the models they superseded, modern DSGE models have struggled to explain bubbles and crashes. In addition, their treatment of inflation leaves a lot to be desired (the degree of price stickiness assumed in new Keynesian models is not evident in the real world). Moreover, many of the approaches to modelling adopted in recent years do not allow for a sufficiently flexible trade-off between data consistency and theoretical adequacy. Whilst recognising that there are considerable limitations associated with structural models built along the lines pioneered by the FMP, I continue to endorse the view of Ray Fair, who wrote in 1994 that the use of structural models represents “the best way of trying to learn how the macroeconomy works.”

Wednesday, 17 April 2019

Inflation beliefs

One of the biggest apparent puzzles in macroeconomic policy today is why inflation remains so low when the unemployment rate is at multi-decade lows. The evidence clearly suggests that the trade-off between inflation and unemployment is far weaker today than it used to be or, as the economics profession would have it, the Phillips curve is flatter than it once was (here). But as the academic economist Roger Farmer has pointed out, the puzzle arises “from the fact that [central banks] are looking at data through the lens of the New Keynesian (NK) model in which the connection between the unemployment rate and the inflation rate is driven by the Phillips curve.” But what if there were better ways to characterise the inflation generation process?

Originally, the simple interpretation of the Phillips curve suggested that policymakers could use this trade-off as a tool of demand management – targeting lower (higher) unemployment meant tolerating higher (lower) inflation. However, much of the literature that emerged in the late-1960s and early-1970s suggested that demand management policies were unable to influence unemployment in the long run and that it was thus not possible to control the economy in this way. The reason is straightforward – efforts by governments (or central banks) to stimulate the economy might lead in the short run to higher inflation, but repeated attempts to pull the same trick would prompt workers to push for higher wages, which in turn would choke off labour demand and raise the unemployment rate. In summary, government attempts to drive the unemployment rate lower would fail as workers’ inflation expectations adjusted. The absence of any such long-run trade-off implies that the Phillips curve is vertical in the longer term (see chart).
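In textbook terms – a standard expectations-augmented formulation rather than anything specific to the papers cited here – the argument can be written as:

\[
\pi_t = \pi^{e}_t - \alpha\,(u_t - u^{*}), \qquad \alpha > 0
\]

where \(\pi_t\) is inflation, \(\pi^{e}_t\) expected inflation, \(u_t\) the unemployment rate and \(u^{*}\) the natural rate. In the short run, with \(\pi^{e}_t\) given, lower unemployment buys higher inflation; but once expectations catch up (\(\pi^{e}_t = \pi_t\)) the equation collapses to \(u_t = u^{*}\), i.e. the long-run Phillips curve is vertical.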

Another standard assumption of NK models, which are heavily used by central banks, is that inflation expectations are formed by a rational expectations process. This implies some very strict assumptions about the information available to individuals and their ability to process it. For example, they are assumed to know in detail how the economy works, which in modelling terms means they know the correct structural form of the model and the values of all its parameters. Furthermore, they are assumed to know the distribution of the shocks hitting the economy. Whilst this makes the models intellectually tractable, it does not accord with the way in which people think about the real world.

But some subtle changes to the standard model can result in significant changes to the outcomes, which we can illustrate with reference to some interesting recent work by Roger Farmer. In a standard NK model the crucial relationship makes inflation a function of expected inflation and the output gap, and it produces the expected result that the long-run Phillips curve is indeed vertical. Farmer instead postulates a model in which the standard Phillips curve is replaced by a ‘belief’ function in which expected nominal output depends only on what happened in the previous period (a martingale process). Without going through the full details (interested readers are referred to the paper), the structure of this model implies that policies which affect aggregate demand do have permanent long-run effects on the output gap and the unemployment rate, in contrast to the standard NK model. Moreover, Farmer’s empirical analysis suggests that models using belief functions fit the data better than the standard model.
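A stylised way of writing the contrast – my shorthand rather than the exact notation of Farmer’s paper – is:

\[
\text{NK Phillips curve:}\quad \pi_t = \beta\,E_t[\pi_{t+1}] + \kappa\,y_t
\]
\[
\text{Belief function:}\quad E_t[x_{t+1}] = x_t
\]

where \(\pi_t\) is inflation, \(y_t\) the output gap and \(x_t\) nominal GDP growth. In the first case inflation is anchored to the output gap and the system is pulled back towards a unique steady state; in the second, whatever rate of nominal income growth people have just observed becomes their expectation for the future, so there is nothing in the model to pin down where the economy eventually settles.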

The more we think about it, the more this structure makes sense. Indeed, as an expectations formation process, it is reasonable to assume that what happened in the recent past is a good indicator of what will happen in the near future (hence a ‘belief’ function). Moreover, since in this model the target of interest (nominal GDP) comprises both real GDP and prices, consumers are initially unable to distinguish between real and nominal effects, even though the shocks which affect them may have very different causes. In an extreme case where inflation slows (accelerates) but is exactly offset by a pickup (slowdown) in real growth, consumers do not adjust their expectations at all. In the real world, where people are often unable to distinguish between real and price effects in the short term (money illusion), this appears intuitively reasonable.
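To put some purely hypothetical numbers on that extreme case: if inflation slows from 2% to 1% while real growth picks up from 1.5% to 2.5%, nominal income growth is unchanged and so, under the belief function, are expectations:

\[
2\% + 1.5\% \;=\; 1\% + 2.5\% \;=\; 3.5\% \quad\Rightarrow\quad E_t[x_{t+1}] = x_t = 3.5\%
\]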

All this might seem rather arcane but the object of the exercise is to demonstrate that there is only a “puzzle” regarding unemployment and inflation if we accept the idea of a Phillips curve. One of the characteristics of the NK model is that it will converge to a steady state, no matter from where it starts. Thus lower unemployment will lead to a short-term pickup in wage inflation. Farmer’s model does not converge in this way – the final solution depends very much on the starting conditions. As Farmer put it, “beliefs select the equilibrium that prevails in the long-run” – it is not a predetermined economic condition. What this implies is that central bankers may be wasting their time waiting for the economy to generate a pickup in inflation. It will only happen if consumers believe it will – and for the moment at least, they show no signs of wanting to drive inflation higher.

Saturday, 13 April 2019

The cost of taking back control


Although the prospect of a no-deal Brexit has been postponed for now, companies incurred significant costs in preparing for an outcome that never materialised. Perhaps these preparations will eventually pay off in the event that a “hard” Brexit does occur, but companies will be hoping that they never have to find out. The cost of their “prepare for the worst and hope for the best” strategy will simply have to be written off as an unanticipated expense of doing business.

If we think of taxation as an extra expense levied on business activity, we can treat the costs of Brexit preparation as an uncertainty tax. The requirement to prepare for Brexit came about as the result of a decision taken by government, in much the same way as it decides how to levy taxes – the only difference being that the state did not see any of the revenue (and before anyone reminds me that the referendum result reflected the view of the electorate, the decision to enact it was taken by government). Analysis conducted by Bloomberg, which looked at the costs incurred by six large companies, indicates that they spent a total of £348 million in Brexit-proofing their businesses (chart). For the two large banks in the sample, which accounted for over 80% of these outlays, the preparations cost 0.5% of revenue. It is not a huge amount in the grand scheme of things, but it represents a transfer of resources away from productive activity towards something with no apparent end-use. Furthermore, the amount spent by HSBC and RBS was equivalent to the annual salaries of 4,000 banking staff.

What is particularly troubling is that these are costs that could have been avoided, or at the very least minimised. The government’s inability to give clear guidance as to what Brexit entailed meant that companies had to figure out the necessary steps for themselves in order to ensure business continuity. Whilst all companies have to prepare for contingencies, the government’s ill-advised decision to leave the EU single market and customs union imposed a cost on business that was wholly avoidable. Ironically, the 2017 Conservative manifesto contained the following pledges: “we need government to make Britain the best place in the world to set up and run modern businesses, bringing the jobs of the future to our country; but we also need government to create the right regulatory frameworks ... We will set rules for businesses that inspire the confidence of workers and investors alike.”

The government has also committed £1.5bn of public money to planning for a no-deal Brexit, and the 6,000 civil servants engaged in that planning have now been stood down. In the bigger picture £1.5bn is peanuts, but back-of-the-envelope calculations suggest that it is equivalent to the annual cost of employing 30,000 nurses (around 10% of the total currently employed) or a similar number of police officers (around 25% of the current total). Or to put it another way, the UK could have guaranteed the funding of 6,000 police officers for five years at a time when there is concern that police numbers are too low.
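The per-head figure implicit in these calculations – a rough assumption about fully-loaded annual employment costs rather than an official number – is around £50,000 a year:

\[
\frac{£1.5\text{bn}}{30{,}000} \approx £50{,}000 \quad\text{and}\quad \frac{£1.5\text{bn}}{6{,}000 \times 5} \approx £50{,}000
\]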

It is thus evident that Brexit is not just some arcane parliamentary debate – even the prospect of it entails real resource costs, as funds have to be diverted from areas where they would surely provide a higher social benefit. Fiscal trade-offs always involve an element of guns or butter, but it is hard to disagree with the claim by Labour MP Hilary Benn, chair of the parliamentary Brexit committee, that this was a “costly price” to pay for the prime minister’s insistence on keeping no-deal on the table.

And we are not out of the woods yet. The fact that the can has merely been kicked down the road may have avoided the cliff-edge, but it does nothing to improve companies’ certainty with regard to the future. Business fixed investment has declined for the last four quarters and is still only slightly higher than it was prior to the recession of 2008-09. Output growth thus appears to have been driven by an increase in labour input rather than capital, and whilst this has driven the unemployment rate to its lowest since the mid-1970s, the lack of investment is one of the reasons behind the UK’s poor productivity performance. To the extent that productivity is one of the key drivers of living standards, this sustained weakness in investment acts as a warning signal that Brexit-related uncertainty continues to have wider economic ramifications.

The day after Nigel Farage launched his new political party, which plans to stand in the European elections on an anti-EU platform (the irony), I continue to wonder as an economist what it is he is aiming for. As Chancellor Philip Hammond put it in 2016, “people did not vote on June 23rd to become poorer or less secure.” Yet the economics indicates that is exactly what Farage’s policies would entail. But then it’s never been about the economics – it’s all about taking back control. As was so spectacularly demonstrated in Brussels on Wednesday evening, when the EU27 took control of the Brexit process.