I have to confess that I have long been torn between the
intellectual pursuit of academic economics and the uselessness of much of its
output. Part of the appeal of economic theory is that it attempts to address
problems in a rigorous manner. Of course, that is also its Achilles heel: the
intellectual underpinnings of much that passes for state-of-the-art
thinking are simply bogus. For that reason, papers such as the one by Paul
Romer, chief economist at the World Bank, entitled ‘The Trouble with
Macroeconomics’ (here),
always strike a chord. In many ways, this is a subversive read for
macroeconomists and makes a number of serious, but in my view substantiated,
allegations regarding the state of economics today.
Romer’s key thesis is that the “identification problem” in
economics has essentially taken us round in a circle back to where we started
in the 1970s. I hope readers will forgive a little digression at this point so
that we can more easily understand the nature of the problem which Romer sets
out. The identification problem requires, as Chris Sims noted in a famous 1980
paper, that we must be able to identify “observationally distinct patterns of
behaviour” for a given model. This is both a philosophical and empirical
argument. Philosophically, it requires us to specify very carefully how our
economic system works. In an empirical sense, it means we must construct models
in which unique values for each of the model parameters can be derived from
other variables in the system. This in turn allows us to clearly identify how
economic linkages operate.
As it happens, most empirical macro models in use in 1980
were over-identified: it was possible to explain each variable in the model by various
different combinations of other variables. Consequently, we were unable to determine
precisely how the macroeconomy worked. In his 1980 paper entitled ‘Macroeconomics
and Reality’ (a title which, when I first read the paper, seemed to be most
inappropriate) Sims noted that such models could only be made to work by
applying “incredible” identifying restrictions.
This identification problem is the key to understanding Romer’s
critique of much modern macroeconomic theory. In his view, by trying to get
away from imposing such “incredible” restrictions, macroeconomists have ended
up devising models which themselves are increasingly divorced from reality. Romer
starts by taking direct aim at Real Business Cycle (RBC) models which were a
direct response to many of the criticisms of the over-identified macro models of
the 1970s. He argues that they make a hugely simplifying assumption that cyclical
fluctuations in output are solely the result of shocks. The question Romer
poses is "what are these imaginary shocks?" Is there really no role
for monetary policy, as much of the thinking in this field suggests? If that is
true, we should all pack up and go home – and the Fed, ECB, BoE et al should
abandon attempts to stabilise the economic cycle.
He then aims his blunderbuss at the DSGE models which followed
from this, arguing in effect that they are a post-truth way of looking at the
world because they rely less on data and more on a series of assumptions about
how the world works. (I posted on this topic here).
Indeed, such is Romer’s apparent contempt for some of this analysis that he
states “the noncommittal relationship
with the truth revealed by these methodological evasions [and] dismissal of
fact goes so far beyond post-modern irony that it deserves its own label. I
suggest ‘post-real’.”
Even worse, in his view, is that many of the proponents of modern
macroeconomics have tended to band together and reinforce each other’s views
rather than challenging them. This unwillingness to challenge belief systems
has, in Romer’s opinion, promoted a stagnant culture which has slowed the
advancement of new ideas in economics.
Faced with such a nihilistic view of economics, you may wonder
what the point of it all is. I share Romer’s criticisms of many of the ideas which
have found their way into the economic mainstream. But I also adhere to the
George Box school of thought that whilst all models are wrong, some are useful.
There is thus nothing inherently wrong with the idea of going in the direction
of RBC or DSGE models – it is just that they have captured the high
intellectual ground and have proven difficult to shift. And like it or
not, many of the competing theories have not proven up to scratch either.
What is also interesting is that whilst many economists
passionately advocate a particular school of thought, few people of my
acquaintance argue in public in such terms. So either this debate is a
particularly American academic thing or it is confined to those pushing hard to
get their material into the journals (and failing). And finally, I have long
argued that the financial crisis will act as an efficient way of winnowing out
many of the worst ideas in macroeconomics. Just as the Great Depression of the 1930s
produced the ideas of Keynes and his acolytes, so the current crisis in the western
world may yet lead to a more fruitful approach to many economic issues. So chin
up, Mr Romer. The darkest hour comes just before dawn.
Sunday, 20 November 2016
Brexit: A Bayesian view
The Reverend Thomas Bayes was an English clergyman who lived
in the first half of the eighteenth century, and who also happened to be a
mathematician. He gave his name to a branch of statistics which has emerged
from relative obscurity in recent years, and which helps better understand the
world around us. The insight of Bayesian statistics is that it treats
probability as a measure of uncertainty: a degree of belief about a particular
outcome. The only fixed quantity is the data, so some outcomes are more
believable than others given the data and our prior beliefs.
So-called classical statistics, which is most people’s introduction to the subject, relies on the insight that probability represents a fixed long-run relative frequency in which the likelihood of an event emerges as a ratio from an infinitely large sample size. In other words, the more observations we have, the more likely it is that the most frequently observed outcome represents the true mean of a given distribution.
To illustrate how these two schools of thought differ, consider the case of horse racing. Two horses – let’s call them True Blue and Knackers Yard – have raced against each other 15 times. True Blue has beaten Knackers Yard on 9 occasions. A classical statistician would thus assign a probability of 60% to the likelihood that True Blue wins (9/15), implying a 40% chance that Knackers Yard will win. But we have additional information: on 5 of the 7 occasions when the weather has been wet, Knackers Yard has won, whilst True Blue won the other two wet races. The question of interest here is what are the odds that Knackers Yard will win knowing that the weather ahead of the sixteenth race is wet? To answer this, we can combine two pieces of information: the head-to-head performance of the two horses, and their performance dependent on weather conditions.
In order to do this, we make use of Bayes’ theorem, which is written thus:
P(A | B) = P(B | A) × P(A) / P(B)
P(A|B) is the likelihood that event A occurs conditional on
event B. In this case, we want to know the probability that Knackers Yard wins
conditional on the fact it is raining. P(B|A) is the probability of the
evidence turning up, given the outcome. In this case, we want to know the
likelihood that it is raining given that Knackers Yard wins. Since Knackers
Yard won 6 races in total, of which 5 were wet, the answer is 5/6
or 83.3%. P(A) is the prior probability that the event occurs given no additional
evidence. In this case, the probability that Knackers Yard wins is 40% (it has
won 6 out of 15 races). P(B) is the probability of the evidence arising,
without regard for the outcome – in this case, the probability of rain
irrespective of which horse won. Since we know there were 7 rainy days out of
15 races, P(B)=7/15 = 46.7%. Plugging all this information into the formula, we
can calculate that P(A|B)=71.4%.
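For readers who prefer to see the arithmetic laid out, the whole calculation can be reproduced in a few lines of Python. This is just a sketch of the worked example above, using the figures quoted in the text.
```python
# Bayes' theorem applied to the horse-racing example in the text:
# 15 races, True Blue won 9 and Knackers Yard 6; 7 races were wet,
# of which Knackers Yard won 5.

def posterior(p_b_given_a, p_a, p_b):
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

p_rain_given_ky_win = 5 / 6   # P(B|A): 5 of Knackers Yard's 6 wins came in the wet
p_ky_win = 6 / 15             # P(A): prior probability that Knackers Yard wins
p_rain = 7 / 15               # P(B): 7 of the 15 races were wet

result = posterior(p_rain_given_ky_win, p_ky_win, p_rain)
print(f"P(Knackers Yard wins | rain) = {result:.1%}")   # -> 71.4%
```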
Now all this might appear to be a bit geeky but it is an interesting way to look at the problem of how the UK economy is likely to perform given that Brexit happens. Our variable of interest is thus P(A|B): the UK’s economic growth performance conditional on Brexit; P(B) is the likelihood of Brexit and assuming (as the government seems to suggest) that it is set in stone, we set it to a value of 1. Moreover, assuming that Brexit will happen regardless of the economic cost (i.e. ministers are not overly concerned about accepting a hard Brexit) then P(B|A) is also close to unity.
In effect, the Bayesian statistician might suggest that P(Growth│Brexit)=P(Growth). Since the only concrete information we have on economic performance is past performance, it is easy to make the case from a Bayesian perspective that the UK's future growth prospects can be extrapolated from past evidence. Those pro-Brexiteers who say that the UK’s post-Brexit performance will not be damaged by leaving the EU may unwittingly have statistical theory on their side. But one of the key insights of Bayesian statistics is that we change our prior beliefs as new information becomes available. If growth slows over the next year or so, then other things being equal, it would be rational to reduce our assessment of post-Brexit growth prospects.
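To make the updating point concrete, here is a simple sketch – my own illustration rather than part of the argument above – of how a Bayesian might revise a prior belief about UK trend growth as post-referendum data arrive, using a standard normal-normal update. All of the numbers are assumptions chosen purely for illustration, not forecasts.
```python
# Illustrative normal-normal Bayesian update of a belief about UK trend growth.
# Prior: growth of about 2% a year; one noisy observation then shifts the belief.

def update(prior_mean, prior_var, obs, obs_var):
    """Return the posterior mean and variance after one noisy observation."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

prior_mean, prior_var = 2.0, 0.5 ** 2   # assumed prior: 2% growth, sd 0.5pp
obs, obs_var = 1.0, 1.0 ** 2            # suppose growth comes in at 1%, sd 1pp

mean, var = update(prior_mean, prior_var, obs, obs_var)
print(f"posterior belief about growth: {mean:.2f}% (sd {var ** 0.5:.2f}pp)")
# -> roughly 1.80% (sd 0.45pp): weaker data pull the belief down, but not all the way
```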
Incidentally, a joke doing the rounds of the statistics community at present suggests that although Bayes first published the theorem which bears his name, it was the French mathematician Laplace who developed the mathematics underpinning this branch of statistics. As a result, Brexit may present a good opportunity to give due credit to the Frenchman by naming it Laplacian statistics. It’s enough to make arch-Bayesian Nigel Farage choke on his croissant.
Wednesday, 16 November 2016
Boiled frogs and QE
For a long time central bankers told us that quantitative
easing was the best thing since sliced bread. It would, so the conventional
wisdom went, allow for a potentially limitless expansion of the central bank
balance sheet which would flood the economy with liquidity and eventually
result in a recovery in demand.
Those who have been reading my material over the years will know that I have never been fully convinced of the merits of QE. Back in 2009, I pointed out that using QE to stimulate domestic recovery would be hampered by the weakness of the banking sector. I also suggested that “it is unclear whether a policy which acts to improve credit supply will help to stimulate activity when demand for credit remains limited.” In response to such criticisms, the BoE later held an impromptu session to explain to financial sector economists that the main channel through which QE worked was via the wealth effect. In this way, BoE purchases would drive down yields and force bond holders to switch into other assets. This in turn would boost household wealth and help support an economic upturn. In fairness, the BoE was correct in its assessment that investors would be forced to switch into higher yielding assets – the problem was (and is) that it is financial investors who have benefited rather than households.
It is this kind of thinking which has prompted much of the recent criticism of central bank policy, particularly by politicians. But as BoE Governor Carney noted yesterday in parliamentary testimony, “an excessive focus on monetary policy in many respects is a massive blame deflection exercise.” He is certainly right on that, as those of us who believe there is an expanded role for fiscal policy in the current conjuncture would attest. However, the BoE should not be allowed to get off scot-free. Some five years ago I recall having a conversation with one BoE official who, in response to my question of why QE should be expanded given that its marginal impact had clearly waned, replied in effect that “more QE does no harm, so it cannot hurt to do too much rather than too little.”
Being charitable, I guess that no policymakers thought that monetary policy would have to remain in post-crisis expansionary mode as long as it subsequently has done. And it probably seemed reasonable to central bankers in 2011 that a further dose of bond purchases would not do much harm. After all, there were not that many suggestions at the time that QE was overly harmful. However, I did point out as long ago as 2009 that “the impact of quantitative easing in lowering bond yields will pose real problems for pension funds.” We might have been able to wear that for a year or two, but few if any would have expected that both the BoE and ECB would still be buying assets in 2016, which in part suggests that it is the duration of the monetary easing phase, rather than the easing per se, which is the problem. Indeed, as Carney’s quote suggests, it is the government’s failure to step in to provide additional policy support which has thrown the onus onto central banks.
One of the great ironies of QE is that rather than making life easier for the banking system by providing it with a huge liquidity injection, things have got a lot tougher. Action to cut short rates to zero, or into negative territory, has increased the cost to banks of holding excess reserves whilst the QE policy has flattened the yield curve, which in turn has reduced the spread which banks need in order to make money. In many ways, the side effects of QE are akin to the frog-boiling syndrome. If you put a frog in a pan of boiling water it will immediately jump out, but if you put it in a pan of cold water and gradually turn up the heat, it will not realise that it is being boiled alive. Banks in particular are now waking up to the prospect of being boiled alive, and the ECB may even turn up the heat still further if it announces an extension of its QE programme in December.
Some respite may be afforded by the recent Trump-induced rise in bond yields, which if sustained could alleviate some of the margin pressure. But we are all now increasingly alert to the dangers of relying on more QE. This is not to say that it should necessarily be reversed but without more thought to the mix between monetary and fiscal policy, electorates in other countries might be tempted to follow the example set by the US and UK, and jump right out of the pan.
Saturday, 12 November 2016
The dawn of fiscal reality?
The week in markets ended on a volatile note as investors began to digest the full economic implications of a Trump presidency. Emerging markets took the full brunt as it dawned upon them that the protectionist rhetoric which passed for economic debate during the election campaign was now a step closer to being realised. But bond markets also sold off sharply. The yield on 10-year Treasuries has risen by 30 basis points since Tuesday, dragging other industrialised markets in their wake. This in turn was triggered by fears of higher inflation stoked by a significant expansion of US fiscal policy.
According to analysis by the Tax Policy Center (here), Trump’s tax proposals would boost growth but cut Federal revenues. They include
reductions in marginal tax rates; increases in standard deduction amounts;
lower personal exemptions; caps on itemised deductions; and allowing businesses to
expense investment rather than depreciating it over time, though businesses
doing so would not be allowed to deduct interest expenses.
A simple static calculation of the costs suggests this would cut revenues by $6.2 trillion over the first decade. But a looser fiscal stance would boost output, with simulations indicating an increase in output of between 0.4 and 3.6 percent in 2017, 0.2 and 2.3 percent in 2018, and smaller amounts in later years. The analysis indicates that over the first eight years, higher activity partially offsets the static revenue losses. But in the longer-term higher interest rates resulting from bigger budget deficits begin to crimp investment, thus leading to slower GDP growth and even higher revenue losses. All told, the plan would increase the debt-to-GDP ratio by 25.4% over a ten-year horizon and by 55.5% by 2036. Faced with these kinds of numbers, it is hardly surprising that bond markets are worried.
But just as a Trump presidency raises fears about the long-term prospects for free world trade, so it could also mark a pivotal moment in the mix between fiscal and monetary policy. Although the attacks by politicians on the independence of central banks in recent months have overstepped the mark, both the Brexit and US presidential votes suggest that electorates are fed up with what they are getting from governments. Indeed, households in all industrialised economies are not paying any less tax but they are receiving less back from the state as outlays are cut. With the UK government due to present its Autumn Statement on 23 November, the expectations are that it will adopt a much less aggressive approach to eliminating the public deficit than we have seen over the last six years.
Some of us would say that it is about time governments recognised that fiscal policy has a role to play in helping to get the economy back on its feet. It is certainly one of the key lessons of the 1930s, when the policy of monetary orthodoxy in the wake of the crash of 1929 contributed to the severity of the Great Depression. My own efforts to get this message across have fallen on deaf ears in recent years, with continental European economists particularly hostile to the view. But in a paper in early 2015, I pointed out that IMF research indicated that fiscal tightening could – under certain circumstances – prove to be counterproductive. Everything depends on how high we believe the fiscal multiplier to be. In simple terms, the multiplier measures the proportional change in economic output for a given change in the fiscal stance. Thus, if a fiscal tightening (expansion) of 1% of GDP produces a reduction (increase) of less than 1% in GDP, the multiplier is less than unity. This can be used to justify a policy of fiscal austerity to tackle excessive fiscal imbalances.
But if a 1% fiscal tightening produces a decline of more than 1% in GDP, the multiplier is greater than unity and a policy of austerity becomes self-defeating. Prior to the Great Recession, the standard view was that the multiplier was less than unity. However, much of the recent academic evidence indicates that pre-recession estimates may have been too low, which should give pause for thought in the fiscal policy debate. I will highlight the empirical analysis resulting from this paper on another occasion, but suffice to say that for countries such as Greece the multipliers appear to have been larger than initially assumed. This in turn has contributed to what can only be described as a Greek economic depression.
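As a rough illustration of why the size of the multiplier matters so much, consider the following one-period sketch. It is my own simplification, not taken from the IMF work: it ignores interest costs and automatic stabilisers, and simply compares the direct reduction in debt with the induced fall in GDP. With debt already near 180% of GDP (a Greek-style starting point), the debt-to-GDP ratio starts to rise once the multiplier exceeds roughly 1/1.8, and a multiplier well above one makes the consolidation clearly self-defeating.
```python
# One-period sketch: effect of a fiscal tightening on the debt-to-GDP ratio.
# Assumptions (illustrative only): no interest costs, no automatic stabilisers.

def debt_ratio_after_tightening(debt_ratio, tightening, multiplier):
    """Debt/GDP after a tightening expressed as a share of initial GDP."""
    new_debt = debt_ratio - tightening            # the consolidation lowers debt...
    new_gdp = 1.0 - multiplier * tightening       # ...but output falls by m * tightening
    return new_debt / new_gdp

start_ratio = 1.80        # debt at 180% of GDP
tightening = 0.03         # consolidation worth 3% of GDP

for m in (0.5, 1.0, 1.5):
    after = debt_ratio_after_tightening(start_ratio, tightening, m)
    print(f"multiplier {m}: debt ratio {start_ratio:.0%} -> {after:.1%}")
# multiplier 0.5 -> ~179.7% (ratio falls); 1.0 -> ~182.5%; 1.5 -> ~185.3% (self-defeating)
```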
Governments and policy makers across Europe woke up this week to the fact that popular resentment is rising. The EU may have been able to dismiss the Brexit vote as a little local difficulty in a country which has never bought into its ideals. But it cannot ignore the message from the US electorate. It may be too late for Italy, which heads to the polls on 4 December. But the French election next spring now assumes even greater relevance for the future of the EU project.
Wednesday, 9 November 2016
The day of fate
In German, 9 November is known as Schicksalstag – the day
of fate – for it was on this day in 1848 that the politician Robert Blum was executed;
in 1923 Hitler launched his first failed attempt at a coup;
in 1938 Kristallnacht took place and in 1989 the Berlin Wall was opened. For the record, it was the day in 1921
that the Italian fascist party was formed. It will also be remembered as the
day that Donald Trump was elected as the 45th President of the
United States.
At 5.30 am this morning, as the rain was beating down against the window, markets also looked to be taking a beating as the news came through that Donald Trump was on his way to the White House. In the event the markets took the election in their stride. Compared to the chaos which ensued after the Brexit referendum, it was small beer. I can only assume that whatever reservations markets have about Trump’s plans, they know that today is not the day to express them. For one thing, he has to wait another two months before being given the keys to 1600 Pennsylvania Avenue. For another, markets learned a lesson after the Brexit vote that it may pay to digest the wider implications of the vote before making a move.
But as one of my colleagues noted today “if 1989 was the year that the walls came down, 2016 was (metaphorically) the year they went back up again.” Voters in the UK and US have now sent a strong message that their tolerance for globalisation has reached its limit. And with Italian Prime Minister Renzi next month staking his future on a constitutional referendum which he may lose, and general elections next year in Germany and France, the stage is set for a period of extreme uncertainty in western politics – and by extension economics.
Economic fears centre primarily on the new president’s attitude towards foreign trade. Unfortunately, globalisation has been seen as a zero-sum game in which there is one winner and one loser. This is far from the case, though try telling that to the former steel workers in the rust belt. Undoubtedly, manufacturing jobs have been shifted offshore, although in the process US consumers have gained access to high quality goods at lower prices than if they were produced locally. Clearly, their utility function is such that they would trade some of this improved material prosperity for some of the jobs which have been lost. This does not make their economic argument right, or indeed wrong, though I suspect it tells us something that I have long suspected – namely, that attempts to measure consumer well-being in monetary terms define the problem of economic dislocation in the wake of globalisation too narrowly. It was a failure of successive UK governments to comprehend this that led to the vote in favour of Brexit.
But where Trump is wrong is to suppose that starting a trade war will alleviate the problems. In a first act, he is likely to kill the Trans-Pacific Partnership, a trade pact involving 12 Pacific Rim nations, which Obama had hoped would be approved by Congress after the election. Indeed a press conference involving representatives from Australia, Japan, New Zealand and Singapore which was scheduled for tomorrow, has now been cancelled. This may be a recognition that progress is unlikely. Admittedly, ending the TPP would not involve any additional costs since it has not been ratified anyway.
But threats to pull out of NAFTA would be more problematic. Trump wants to impose steep tariffs on goods imported from Mexico in order to deter additional jobs from moving south. Economists continue to debate whether NAFTA is beneficial to the US. But it is likely that many of the jobs lost would have gone anyway, and with the US trade deficit with China roughly five times bigger than that with Mexico, around five times as many jobs have been lost to China as to Mexico. Moreover, outsourcing some of the lower value added processes to Mexico and reimporting them into the final product has helped to keep US labour costs down. You can argue about it, but it does not seem as though NAFTA has been the job killer that Trump claims.
Threats to impose import tariffs on the likes of China would quite simply be very bad economics. For one thing, we know where that got us in the 1930s, as beggar-thy-neighbour policies depressed world trade. For another, China holds large swathes of the US bond market. Attempts to pick a trade war could have adverse consequences if the Chinese decided to dump their Treasury holdings.
Trump’s fiscal policies don’t add up either, for they imply a lot of unfunded promises. We can come back to this at a future date but suffice to say that if you want to expand fiscal policy, it may not be a great idea to antagonise the Chinese too much: they might turn out to be the best customers for your bonds! Finally, and in an echo of the UK debate, relations between Trump and Fed Chair Yellen are not exactly cordial, which has given rise to speculation that Yellen could walk the plank when her term expires in early 2018. It would not be the first time that the White House and the Fed have clashed in this way but it does not send a particularly positive signal to the markets on the conduct of monetary policy.
All in all, markets may have dodged a bullet today but there is enough for them to worry about in future to suggest that today could only be the calm before a very big storm.
Monday, 7 November 2016
A brief history of currency unions
Monetary unions have a long history in international
economics and we can trace them as far back as that between Phocaea and
Mytilene in the late fifth or early fourth century BC. But it was from the
late eighteenth century that formal monetary unions began to proliferate,
partly as a way to consolidate political union but also to promote the
conditions for cross-border trade to flourish. Two of the more successful to
emerge from this period were the US monetary union, which came into being when
the Constitution took effect in 1789 (and later evolved into the dollar system),
and the Zollverein of 1834 which laid the foundations for German political and
monetary union in the 1870s.
History suggests that the most successful monetary unions are those which encompass what we would now define as the nation state. Without getting too philosophical about it, a shared language raises the likelihood that smaller regions will find sufficient common ground to form a political union. It is therefore no surprise that the US and German monetary unions have tended to be more durable than those which have looser ties. But as the experience of Belgium and Switzerland indicates, a successful currency union can still emerge from regions which are neither nation states nor share a common language.
However, strong monetary unions tend to be based on regions with common interests, often based around language – and almost always where currency issuance is controlled centrally. Thus the nineteenth century gold standard – which met neither of these criteria – eventually collapsed. There are some similarities between the gold standard and European Monetary Union. Admittedly EMU members share common political aims, if not a language, and the system is underpinned by a central issuer of currency in the form of the European Central Bank. But in both cases member countries are linked together in a system of fixed exchange rates and have given up monetary sovereignty to one degree or another. Whilst in EMU the operation of monetary policy has been fully contracted out to the ECB, under the gold standard individual countries at least retained their own central monetary authority. In neither case, however, do members enjoy monetary autonomy, and both systems rely on economic deflation as the cure for imbalances.
The post-1945 Bretton Woods system suffered from many of the same flaws, and although it made provision for devaluations (a feature which the UK twice utilised in 1949 and 1967) it was designed to be a painful experience. One of the problems evident with the classical gold standard was (to quote John Maynard Keynes) that adjustment was “compulsory for the debtor and voluntary for the creditor.” Despite Keynes’ best efforts to eliminate asymmetric adjustments, the Bretton Woods system operated under the same principle.
Whilst EMU is different to previous cross-border monetary systems because it has a central bank which controls currency issuance and provides a centralised payments system which helps to smooth out capital needs, the fiscal rules which underpin the system highlight other deep flaws. The Maastricht Treaty of 1992 contained a “no bailout” clause under which one country would not be held responsible for the debt of another, and this was reinforced by targets for deficits and debt relative to GDP. Not only were the debt targets ignored – after all, Italy and Belgium joined when their debt ratios were around 100% of GDP versus a stipulation that they should be below 60% – but the “no bailout” clause was deeply flawed in the first place. In an integrated economy such as the EU, one country’s debts largely represent the assets of another. Consequently, the no bailout clause was never going to hold in the long term unless the creditor countries were prepared to take a degree of pain in the event that others experienced debt problems.
Moreover, no attention was paid to external imbalances during the EMU entry process. And we should have known better, since it was external imbalances which eventually did for Bretton Woods. An economy such as Greece, which was reliant on international inflows to cover its external deficit, was always vulnerable to a sentiment shift such as occurred in 2008. EMU is thus subject to the Achilles heel of previous systems – how to manage current account imbalances in a system of fixed exchange rates. The painful truth is that they cannot be managed unless surplus countries are prepared to recycle liquidity to finance the debt of others.
It is for this reason that the huge surpluses being built up by the likes of Germany pose such a threat to the existence of the single currency. Whilst the EU and IMF call for Germany to expand its fiscal policy in a bid to stimulate demand, and thus help to alleviate the imbalances, we may not even have to go that far. A simple recycling of the surplus by other means will suffice – perhaps via the banking sector, which after all funded the deficit countries prior to 2008. However, the European banking system is not in sufficiently good shape to perform the same role today. So if surplus countries are not prepared to loosen their fiscal stance, the EU’s warning issued earlier this year may yet come back to haunt the single currency region: “[Germany’s] persistently high current account surplus … accounts for three quarters of the euro area surplus [and] has adverse implications for the economic performance of the euro area.” As the economist Herbert Stein once warned “if something cannot go on forever, it will stop.”
Sunday, 6 November 2016
The rhymes of history
“More than the success of the Brussels negotiations is imperilled by the divisions of the western world … The western alliance lies spread-eagled after losing both common purpose and mutual confidence.” It could have been written yesterday. In fact it was written almost 54 years ago and forms the opening lines of the Glasgow Herald editorial from 15 January 1963. The context of the article was French President de Gaulle’s decision to reject British attempts at membership of what was then called the EEC, whilst also rejecting US overtures to provide the weaponry to help defend Europe which he regarded as usurping the French place on the world stage.
Fast forward to 2016 and we are again debating the nature of Britain’s relationships with its European partners, whilst the extent to which the US is prepared to underwrite Europe’s military defence has been one of the issues in Donald Trump’s US presidential campaign. In many ways, de Gaulle’s fears look remarkably prescient. When asked what was France’s position regarding Britain’s entry into the Common Market, he replied “The Treaty of Rome was concluded between six continental States, States which are, economically speaking … of the same nature … The entry of Great Britain … will completely change the whole of the actions, the agreements, the compensation, the rules which have already been established between the Six … Then it will be another Common Market … which would be taken to 11 and then 13 and then perhaps 18 [and] would no longer resemble, without any doubt, the one which the Six built.”
To put it simply, de Gaulle foresaw that the entry of the UK would be the thin end of an expansionary wedge which would change the nature of the European project. Historians argue about the old man’s motives but there is little doubt that the UK has never sat comfortably within the EU. Moreover, the eastward expansion in 2004, which almost doubled the number of member states, did indeed change the nature of the project as de Gaulle predicted.
I find this whole debate fascinating because it highlights that many of the problems which we face in the US and Europe today would benefit from a little historical perspective. We can debate whether the trend in the US towards retreat from some of the world’s more intractable problems has echoes of the isolationism of the 1930s with all its attendant consequences, though I have no intention of doing so here. For those interested in a more detailed take on US foreign policy, I would recommend organisations such as The Foreign Policy Association (here). Suffice to say that we should have a better idea next week of the direction in which US foreign policy is likely to evolve.
But when it comes to European issues, it is safe to say that never in my lifetime has the continent appeared so febrile and policy so lacking in direction. In recent years, the EU has suffered a crisis of confidence brought about initially by the Greek debt crisis and latterly by a huge refugee influx. In part, this reflects the over-confidence of the 1990s which prompted the EU to expand too far, too fast. It also reflects policy mistakes, particularly with regard to the economy. Indeed, it increasingly appears that the construction of monetary union failed to adhere to de Gaulle’s message that forcing disparate countries together in a single economic system was a recipe for disaster. This came about because policy makers across Europe used economic structures for political purposes. But as every economic historian knows, fixed currency arrangements tend to end in tears (I will explore this in a future post).
The solution to many of the EU’s economic woes is more federalism, but the tide of public opinion is swinging back in the other direction. It increasingly looks as though the EU built the foundations of the structure which it wanted to become but delayed for too long in building the walls, let alone the roof. The tide of history is now moving too quickly for the EU to resurrect the ideas of the 1990s but a return to the Common Market ideal of 1957 also appears unpalatable to many European politicians today. Yet the latter option may be the only solution which is ultimately workable for a Europe riven with disparate economic and political goals.
As for the UK, it is quite clearly a nation ill at ease with itself. Half its voters want to secede from the EU and half do not. Yet such is the febrile mood today that when the judiciary rules that parliament must have a say in how the country leaves the EU, it is accused by the Fourth Estate (the unelected and self-styled voice of the people) of being an enemy of democracy. Lest we forget, the early years of the Thatcher government were also a period of extreme social unrest, but I cannot recall an atmosphere as poisoned as the one we have today. Economic rationality (if such a thing can be said to exist) is being totally ignored as we debate our economic future.
Mark Twain is reputed to have said that history never repeats but it does rhyme. But the poem being written today is not of the epic variety. It is closer to the worst kind of doggerel.