Wednesday, 23 December 2020

Bad but by no means the worst

By the standards of our lifetimes this will be one of the more unusual Christmases we have ever experienced. Large parts of Europe are living under lockdown conditions and many millions of people will be separated from their extended families, in many cases not having seen them since last Christmas. Undoubtedly many of us have been inconvenienced but spare a thought for those who have lost loved ones during the year. Spare a thought, too, for those in front line service jobs who have worked to look after us and provide the services that have kept the economy afloat. Health professionals (doctors, nurses and the ancillary staff who keep the system running) have had a tough year and they will continue to work over the holiday season. Special thanks are also due to those who have kept the lights on, delivered the goods to our doorstep and the countless other services which have allowed us to maintain a semblance of normality in 2020.

All of this did get me wondering how badly this year’s Christmas stacks up against past years. It certainly will not be the worst in history. The outbreak of bubonic plague in Europe between 1346 and 1353, which is estimated to have killed 60% of the population, would have made for a truly frightening experience. In a forerunner of today's social distancing, cities such as Venice and Milan put emergency public health measures in place to limit personal contact whilst the Adriatic port city of Ragusa (modern-day Dubrovnik) was the first to pass legislation in 1377 requiring the mandatory quarantine of all incoming ships and trade caravans in order to screen for infection. Well-documented outbreaks of bubonic plague afflicted large European cities over the next 300 years with outbreaks in London in 1563 and 1665 particularly noteworthy. Ironically, plague outbreaks tended to subside during the winter months as the disease vectors (rats and their fleas) retreated in the wake of colder weather. Christmas may thus have seemed a miraculous interlude in an otherwise endless cycle of misery.

Christmas during periods of war is also dreadful. We have been regaled over the decades with stories about how bad Christmas was in the UK during World War II, with 1944 described as “the most joyless Christmas of the war.” One amusing anecdote from that period is that such was the lack of alcohol that year “that, of the half million inhabitants of Kensington, Hammersmith, Fulham and Chelsea, in London, only one woman was arrested that year for drunkenness over the holiday.” There was nothing to laugh about in Germany in 1944. Supply shortages were almost intolerable in cities which were ruined shadows of their former selves, whilst weekly working hours had recently been increased to 60 to free up labour for the war effort.

Of course, war and disease were not the only dampening factors on Christmas celebrations. Before the Reformation in 1560, Christmas in Britain was celebrated as a religious feasting day. But the rise of the Puritan movement in the seventeenth century meant that the season was increasingly frowned upon as a frivolity associated with Roman Catholicism. In 1640 the Scottish Parliament passed a law that made celebrating ‘Yule vacations’ illegal. In 1643 the English parliament followed suit, passing a law calling on the people to treat the mid-winter period 'with the more solemn humiliation because it may call to remembrance our sins, and the sins of our forefathers, who have turned this feast, pretending the memory of Christ, into an extreme forgetfulness of him, by giving liberty to carnal and sensual delights'. In 1644, parliament followed this up by abolishing the feasts of Christmas, Easter and Whitsun altogether. From this point until the Restoration in 1660, Christmas in England was officially illegal. Even after the law was repealed in Scotland, Christmas celebrations were frowned upon for a long time afterwards. It was not until 1958 that 25 December became a Scottish public holiday.

Many of the non-conformist Puritan movements which left Europe for the New World in the seventeenth century carried this attitude with them. The Plymouth Pilgrims spent their first Christmas Day in 1620, in what later became the United States, building their first structure. The following year new arrivals, who spent Christmas Day celebrating rather than working, found themselves at odds with the original settlers. It was not until 1681 that laws forbidding the celebration of Christmas were repealed. But “as late as 1870, classes were scheduled in Boston public schools on Christmas Day and punishments were meted out to children who chose to stay home beneath the Christmas tree.”

However inconvenient this Christmas will turn out to be, for the most part it pales into insignificance compared with past privations. Stay safe and enjoy the festivities as best you can – even if it is only via Zoom. All the best to you and yours at this very unusual time.

Monday, 21 December 2020

There may be trouble ahead

It's not what was done ...

I have never known such a sombre mood in the UK as that which prevails today. As if 2020 has not been bad enough, the weekend news that the government has cancelled the planned five-day relaxation of social distancing restrictions over Christmas in response to rising infection rates has thrown the plans of millions into chaos. This was done with the best of intentions in the face of a new variant of SARS-CoV-2 which appears to be more infectious than previous strains. But in response more than 40 countries have, at the time of writing, placed bans on travellers arriving from the UK to limit the spread of the new variant. The most serious of these is the French decision to impose a 48-hour ban on passengers and freight entering from the UK, which will severely disrupt cross-border trade.

The first reaction of many people was to direct their anger at the government. After all, just three days earlier Boris Johnson had told them it would be “inhuman” to ban Christmas as he defended plans to allow households to socialise over the festive period (the fact that local lockdowns in late July were announced hours before the start of the Eid festival did not go unremarked on social media). That said, we should cut the government some slack regarding the decision to impose new restrictions in the face of the most serious health crisis in a century. Many people may disagree, but the experience of the first lockdown was that it did result in a significant reduction in the spread of the disease, albeit at a very high economic cost. Those arguing that the UK should have followed the Swedish model are less vociferous in the face of mounting acceptance in Sweden that the government’s strategy was a mistake, with even the King suggesting that the policy has “failed”.

... but how it was done

A far bigger problem has been the government’s communications strategy. The decision on Saturday afternoon to add a fourth tier of restrictions to the three-tier system with just a few hours’ notice seemed very rushed. Worse, it flew in the face of the message given just three days earlier. Since the government has known about the new Covid variant for some time, it calls last week’s comments defending previous Christmas plans into question. However, this is in keeping with the pattern which Johnson has followed throughout the year. He was late in implementing the first lockdown in March; he resisted the scientists’ calls for a national lockdown in September, opting instead for a series of badly implemented regional lockdowns before being forced to bow to the inevitable and impose a second lockdown in November; and now we have the latest U-turn.

Preparing for border disorder

But it is the restrictions on the flow of goods and people across borders which are the most sobering aspect of the whole issue. Even before the events of recent days queues were mounting on both sides of the Channel as firms attempted to build up stocks ahead of disruption in the event of a no-deal Brexit. One of the consequences has been that the cost of transporting a container of goods has significantly increased, with reports that a container of goods from China to Felixstowe now costs $10,000 per load – four times the usual rate. The French border closure has made the problem significantly worse because hauliers have no incentive to enter the UK for fear of being stuck on this side of the Channel. None of this should come as any real surprise. I did point out two years ago that problems at the ports would quickly lead to large queues.

It may be that the border closure is partly motivated by the desire of the French government to fire a warning shot at Downing Street to indicate what could happen in the event of a no-deal Brexit. Contrary to what the diehards have maintained over the last four years, the UK really does not hold all the cards – it is questionable how many it holds at all. In the absence of either a Brexit deal or an extension of the transition period, this could be just a foretaste of what is to come. Latest reports from within government suggest that the UK has ruled out any Brexit extension. Given Johnson’s record on U-turns, we should not necessarily take this at face value. But if this really is the government’s position, it should brace itself for the mother of all political backlashes in 2021. It is extremely difficult to believe that voters will stand idly by whilst restrictions on cross-border traffic cause such inconvenience, resulting in higher prices and a reduction in the range of goods available for purchase.

Interestingly, a recent IMF working paper looked at pandemics across a range of countries over the period 2001 to 2018 to assess whether they lead to higher inequality and increased social unrest. It concluded that “the results from local projections show that social unrest increases about 14 months after pandemics on average. The direct effect peaks in about 24 months post-pandemic.” Add in the self-inflicted pain of a senselessly hard Brexit and I would not want to be in Johnson’s shoes in 2021.

It's nothing personal – I just oppose incompetence

I was recently accused of peddling anti-Tory propaganda. Since the respondent was anonymous I am sure they will not mind my repeating their response to one of my blog posts: “From the very first words of this article, it's glaringly obvious that the writer is a remoaner. The colouring of the language clearly lays a foundation for the rest of the article to be another Brexit/Tory-bashing tiresome monologue. So, it puts me off. It didn't start as balanced, so I (and I'm guessing many others) didn't read through, because they already knew the theme and conclusion of the story. Shame. There may be many salient points buried within these 1489 words, but I won't go in search for them. I have better things to do with my time.”

Whoever they are, they have missed the point of everything I have written over recent years. My criticisms are not party political (they should read what I wrote about Jeremy Corbyn). They are a response to government incompetence. It is not my intention to take pot shots at the government for the sake of it – I leave that job to the professional columnists, with this article by The Observer’s Andrew Rawnsley neatly summarising Johnson’s unsuitability for leadership at a time when more than flowery rhetoric is required. As Rawnsley put it, if there is light at the end of the tunnel it will have to be exceedingly bright to wipe away all the memories of how long and dark, stumbling and flailing has been the nation’s journey through the tunnel.

Friday, 18 December 2020

Something of value

The Reith Lectures are a long-standing tradition in British radio broadcasting, running back to 1948, in which a leading figure of the day tackles a subject of contemporary interest. One of the consequences of the lockdown is that I had a chance to listen to this year’s lecture series given by Mark Carney, former BoE Governor, and very interesting it was too, for it tackled the issue of how financial value has usurped human value and what we can do to turn this around (the transcripts of his lecture series are also available at the link shown above).

What is value?

This subject is of relevance to all economists, but it is of particular interest to me because as I have noted previously, it is one of the motivating factors behind starting this blog in the first place. Carney’s jumping-off point is to acknowledge that the moral sentiments espoused by Adam Smith, the father of the invisible hand, have become financial sentiments and that “societies’ values became equated with financial value.” I have raised similar points over the years, arguing in 2017 that Smith “never advocated the devil-take-the-hindmost policy which many of his adherents claim.” Indeed, Carney notes that Smith uses the phrase “invisible hand” only once in his magnum opus The Wealth of Nations. My own introduction to Smith’s work (more years ago than I am prepared to admit) focused on his role in developing the idea of comparative advantage in trade rather than his espousal of free markets – a point that Carney reiterated: “the central concept that links all of Smith’s works is the idea that continuous exchange forms part of all human interactions.”

A few weeks ago I referenced a study conducted by the ESCoE into public attitudes towards economics in which those questioned “often associated the economy with money.” Carney highlights that “Smith’s writings warn of the mistakes of equating money with capital.” Somewhere along the line we appear to have drifted a long way from the original ideas sketched out by one of the founding fathers of modern economics. One of the underpinnings of this shift has been a change in the nature of value. In Carney’s interpretation of Smith’s world, value is derived from our desire to be well regarded by others, which creates “incentives to achieve mutual sympathy of sentiments.” In recent years, however, value has taken on a more subjective hue as the neoclassical revolution has gone mainstream. In this scheme, people are encouraged to value a particular good (or service) according to the utility they derive from it. In other words, value becomes what people are willing to pay. What complicates matters enormously is that since tastes can change very quickly, these values are not stable over time. But following the Reaganite/Thatcherite revolution of the early-1980s, which unleashed the power of markets as the ultimate arbiter of choice, this model of value generation has become extremely well entrenched.

However, markets are underpinned by the laws and values of the society in which they operate. They do not spring out of nowhere – the invisible hand must be attached to an invisible arm. A good example of this is the Glass-Steagall Act of 1933, which separated the deposit taking activities of US banks from more risky investment banking operations. It came into being because society was not prepared to condone a repeat of the 1929 Wall Street Crash and the associated economic hardship. Fast forward to 1999 and society’s concerns about the risks associated with banking had diminished to the point at which the US Congress felt able to repeal it. Quite clearly the law operated in the context of the prevailing social norms.

By contrast, Milton Friedman was an arch-proponent of free markets who argued that we could separate market outcomes from their social context. In a famous article published in the New York Times 50 years ago he argued that a business executive who exercises social responsibility in the course of their work “must mean that he is to act in some way that is not in the interest of his employers.” Businesses that do anything other than maximise profits are “unwitting puppets of the intellectual forces that have been undermining the basis of a free society these past decades” and are guilty of “analytical looseness and lack of rigor.” But Friedman’s failure to take account of social context is a major omission. What might have been acceptable corporate behaviour in the 1970s and 1980s no longer is. Moreover, a purely market-oriented policy that fails to take account of social norms would damage a company’s image and be harmful to its long-term survival prospects.

One of the problems with our current system of value setting is that since we define the worth of activity purely in financial terms, we cease to value that which we cannot price. Accordingly mainstream economics now tries to assign a price to civic and social virtues in order to give them meaning. In this way, we have gone beyond operating in a market economy and we are now operating in a market society. But as Carney points out, “there is considerable evidence that commodification, putting a good or service up for sale, can corrode the value of the activity being priced.” Standard examples of this are charitable events which seek to raise money for good causes. The primary motivation is altruism, but would more money be raised if participants were paid? The answer, it turns out, is no.

Will Covid change how we value things?

These philosophical ramblings are all very interesting but what bearing do they have on the issues we face today? In a world where all members of society have been affected in one way or another by Covid, many people have done extraordinary things to help their communities, for which they received no payment. Milton Friedman might have argued that they were crazy, giving away their labour for no monetary reward. But they did so because they believed in the cause for which they were working. In a wider sense, Covid will force societies to re-evaluate their priorities. In the Anglo-Saxon world, the burden of risk has increasingly been shifted onto the individual as the state has reduced its role in the economy. For example, if you lose your job, you have to find another one quickly because the state will not provide for you. But during the pandemic, that is precisely what the state has been forced to do. Does society really want to revert to what went before?

I recently gave a presentation in which I suggested we may be about to relive a 1945 moment. At that time in the UK, memories of high unemployment in the 1930s were still vivid and voters in the first post-war election opted for a government that promised radical social change. Radical post-Covid change will likely be driven by younger voters who wish to see changes to an economic model that has benefited their parents’ generation but done little for them. For example, voters may demand that the state plays a bigger role in the economy in future, since one of the things the state does well is to correct the market failure from negative externalities (e.g. ensuring widespread access to healthcare and education). This in turn could have a major bearing on the way in which society assigns values and would give the green light for future governments to adopt a slow course back to fiscal rectitude rather than a headlong rush.

Obviously we do not know what the future holds but I have long argued that more market-type solutions are not the answer to the mounting economic problems faced by European economies. This is not to say that we should revert to 1970s-style efforts to centralise economic decision-making. But as we learned in 2008 and again in 2020, markets can fail without the right kind of support. The fact that we value intervention in order to avoid worst case spillover effects suggests that we should not be afraid to impose limits on the extent to which we allow free markets to make our value judgements for us.

Thursday, 10 December 2020

Machine learning: A primer

Last month I had the pleasure of taking part in a virtual conference organised by the Bank of England on big data and machine learning (ML). One of the things that struck me most was the relative youth of the presenters, many of whom are still writing their PhDs. This is a clear illustration of the fact that this is a brand new field whose limits are being extended every month and which is increasingly being applied in the field of economics and finance. If ever you wanted to get in on the ground floor of what promises to be one of the new fields of economic analysis, now is a good time to get started.

Some basics 

Big data and ML go hand-in-hand. The development of web and cloud-based systems allows the generation and capture of data in quantities which were unimaginable just a few years ago. It is estimated that 59% of the global population are active internet users – around 4.66 billion people. Every second they send 3.4 million emails, 5,700 tweets and perform almost 69,000 Google searches. PwC reckoned that in 2019 there were 4.4 Zettabytes (ZB) of data stored online – a figure that could hit 44 ZB in 2020. A decent laptop these days will have a hard disk with one terabyte of storage capacity but you would need more than 47 billion of them to store the current volume of our data universe (44 × 1024³). If these were stacked one on top of another, it would generate a column over 1 million kilometres high – three times the distance to the moon. Clearly, a lot of the data stored online does not yield any valuable insight but given the vast amount of available information even a small fraction of it is still too much for humans to reasonably digest.
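For anyone who wants to sanity-check those orders of magnitude, the back-of-the-envelope arithmetic is easy to reproduce in R. The 2.5 cm laptop thickness assumed below is my own illustrative figure rather than anything taken from the PwC report.

```r
# Rough check of the data-volume arithmetic above. One ZB is 1024^3 TB,
# so 44 ZB corresponds to roughly 47 billion one-terabyte laptops.
# The 2.5 cm laptop thickness is an assumption for illustration only.
zb_in_tb <- 1024^3
laptops  <- 44 * zb_in_tb
laptops / 1e9                         # ~47 billion laptops

stack_km <- laptops * 0.025 / 1000    # height of the stack in km
stack_km / 1e6                        # a little over 1 million km
stack_km / 384400                     # roughly three times the Earth-moon distance
```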

This is where the machines come in. Traditional computer programs represent a series of instructions designed to perform a specific task in a predictable manner. But they run into difficulties in the case of big data applications because the decision trees built into the program (the “if-then” loops) can simply become too big. Moreover, a traditional program represents a fixed structure which goes on doing the same thing ad infinitum, which may not be ideal in a situation where we gather more data and begin to understand it better. A machine learning algorithm (MLA) is designed to be much more flexible. Rather than being based on a series of hard-coded decision rules, an MLA incorporates a very large number of basic rules which can be turned off and on via a series of weights derived via mathematical optimisation routines. This makes MLAs more successful than traditional computer programs in areas such as handwriting and speech recognition, and better able to deal with tasks such as driving where rapid adjustment to changing conditions is required.
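To make the distinction concrete, here is a deliberately simple R sketch contrasting a hard-coded rule with a rule whose weights are learned from the data by numerical optimisation. The dataset, variables and threshold are arbitrary choices for illustration and have nothing to do with the conference material.

```r
# A hard-coded rule versus learned weights (illustrative only, base R).
data(mtcars)

# Hard-coded rule: a fixed threshold chosen by the programmer
# (classify a car as having a manual gearbox if it weighs under 3,500 lbs)
rule_based <- ifelse(mtcars$wt < 3.5, 1, 0)

# Learned rule: logistic regression estimates its weights by optimisation
fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
learned <- ifelse(predict(fit, type = "response") > 0.5, 1, 0)

# Compare in-sample accuracy of the two approaches
mean(rule_based == mtcars$am)
mean(learned == mtcars$am)
```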

But a machine needs to be trained in order to progressively improve its performance in a specific task in much the same way that humans learn by repetition. In the AI community there are five broad categories of training techniques, the most common of which is supervised learning in which input and output data are labelled (i.e. tagged with informative labels that aid identification) and the MLA is manually corrected by the human supervisor in order to improve its accuracy[1]. One of the common problems is that the model might fit the training data very well but be completely flummoxed when faced with out-of-sample data (overfitting). By contrast an underfitting model cannot replicate either the training data or the out-of-sample data which makes it useless for decision making.
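A toy example makes the overfitting/underfitting distinction concrete. The R sketch below (simulated data, base R only, purely illustrative) fits three models of increasing flexibility and compares their errors on the training sample and on held-out data; the most flexible model does best in-sample and worst out-of-sample.

```r
# Overfitting versus underfitting on simulated data (illustrative only).
set.seed(42)
df    <- data.frame(x = seq(0, 1, length.out = 60))
df$y  <- sin(2 * pi * df$x) + rnorm(60, sd = 0.3)
train <- sample(60, 40)                    # 40 training points, 20 held out

rmse <- function(model, rows) {
  sqrt(mean((df$y[rows] - predict(model, newdata = df[rows, ]))^2))
}

under <- lm(y ~ 1,           data = df, subset = train)  # too rigid
fair  <- lm(y ~ poly(x, 3),  data = df, subset = train)
over  <- lm(y ~ poly(x, 15), data = df, subset = train)  # too flexible

# In-sample (train) versus out-of-sample (test) root mean squared errors
sapply(list(underfit = under, reasonable = fair, overfit = over),
       function(m) c(train = rmse(m, train), test = rmse(m, -train)))
```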

Our final task is to ensure that the MLA has learned what we want it to. In one early experiment, data scientists tried to teach a system to differentiate between battle tanks and civilian vehicles. It turned out that it learned only to differentiate between sunny and cloudy days and it proved to be useless in real world situations. This demonstrates the old adage that if you ask a stupid question, you get a stupid answer, and highlights the importance of setting up the MLA in order that it focuses on the question of interest. 

Applying ML to economics 

How is any of this relevant to economics? First of all ML has the potential to revolutionise our statistical analysis of big datasets. In particular, certain applications should make it easier to reduce the dimensionality of big datasets, making them more manageable whilst still retaining the meaningful properties of the original data (see below). This is important because large datasets are often “sparse”, i.e. a relatively small number of non-zero observations surrounded by a large number of zeros, which tends to be a hindrance to many traditional statistical estimation methods.
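As a flavour of what dimensionality reduction looks like in practice, the sketch below applies principal components analysis – a long-established technique that underpins many ML pipelines rather than a method taken from any paper cited here – to the four iris measurements used later in this post.

```r
# Dimensionality reduction via principal components analysis (illustrative).
data(iris)
pca <- prcomp(iris[, 1:4], scale. = TRUE)

# Share of total variance captured by each component: the first two
# components retain most of the information in the four raw measurements
summary(pca)$importance["Proportion of Variance", ]

# The reduced, two-dimensional representation of the 150 observations
head(pca$x[, 1:2])
```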

ML also theoretically allows us to estimate and compare a range of models more easily. In applied economics, researchers normally start by choosing a single functional form and putting their efforts into a statistical assessment of whether the data agree with their preconceptions. Given the labour-intensive nature of the exercise it is usual for researchers to operate with only one model, and comparing a range of models using traditional analytical techniques quickly becomes impractical. However, ML should make it easier to compare a range of different models. In a very interesting paper on the application of ML techniques to economics, Susan Athey argues that a more systematic approach to model selection facilitated by ML, which removes the arbitrary approach to specification searches, “will become a standard part of empirical practice in economics” in future.
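As a minimal sketch of what a more systematic comparison can look like, the code below scores three arbitrary candidate specifications by five-fold cross-validation, turning model selection into a loop rather than a one-off judgement call. The dataset and specifications are illustrative and are not taken from Athey's paper.

```r
# Comparing candidate specifications by five-fold cross-validation (base R).
set.seed(1)
data(mtcars)
folds <- sample(rep(1:5, length.out = nrow(mtcars)))

specs <- list(m1 = mpg ~ wt,
              m2 = mpg ~ wt + hp,
              m3 = mpg ~ wt + hp + disp + qsec)

cv_rmse <- function(formula) {
  errs <- sapply(1:5, function(k) {
    fit  <- lm(formula, data = mtcars[folds != k, ])
    pred <- predict(fit, newdata = mtcars[folds == k, ])
    (mtcars$mpg[folds == k] - pred)^2
  })
  sqrt(mean(unlist(errs)))
}

# Out-of-sample error for each candidate specification
sapply(specs, cv_rmse)
```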

An example 

In order to give a flavour of the application of ML techniques, I present here an example of a supervised learning technique known as a random forest model. This is a classification technique, built from an ensemble of decision trees, in which a large number of data observations are assigned to a small number of groups – reducing the effective dimensionality of the problem in the process.

To conceptualise a random forest model, think of a decision tree formed by splitting our dataset into two (see chart above). Both halves can be further divided into sub-categories until at some point we run out of ways to split them further (in other words there is no additional information content). If we grow many such trees on the same underlying data they will not be independent of each other. If, however, each tree is built on a random sample of the data (and of the candidate variables at each split), averaging across the resulting “forest” can be shown to produce more stable and accurate predictions than a single decision tree (for those interested in a more detailed discussion, this paper from the BoE is very accessible). We “train” our model by allowing it to operate on a sub-sample of our dataset and apply the “knowledge” gained during the training period to the rest of the sample to see whether it makes accurate predictions.

The Bank of England applied ML techniques in a paper using random forest models to predict banking distress. Based on this blog post by Saulo Pires de Oliveira, we can demonstrate exactly the same techniques used in the BoE paper to show the random forest model in action. It is written in the R software system and rather than use financial data, we use the famous Anderson iris data set which records the characteristics of three species of iris (the data comes as standard in the R system). Our objective is to determine on the basis of the characteristics (the length and the width of the sepals and petals) which category of plant each observation belongs to. The code is available below and since R is free to download it is a simple matter of copying this code into R and running it to reproduce the results.
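A sketch along the following lines reproduces the key steps: train a forest on a random subset of the iris data, predict the held-out observations, and compute a ROC curve and the area under it. The package choices (randomForest and pROC, both on CRAN) and parameter values are illustrative and may differ in detail from the original script.

```r
# Random forest on the iris data with a ROC cross-check (illustrative sketch).
# install.packages(c("randomForest", "pROC"))   # if not already installed
library(randomForest)
library(pROC)

set.seed(123)
data(iris)

# Split the 150 observations into a training set and a hold-out test set
train_idx <- sample(nrow(iris), 100)
train <- iris[train_idx, ]
test  <- iris[-train_idx, ]

# Train the forest on the four measurements to predict the species
rf <- randomForest(Species ~ ., data = train, ntree = 500)

# Apply the trained model to the unseen observations
pred_class <- predict(rf, newdata = test)
table(predicted = pred_class, actual = test$Species)

# ROC curve and area under it for one class (ROC analysis is binary,
# so we score "virginica" against the other two species)
prob <- predict(rf, newdata = test, type = "prob")[, "virginica"]
roc_obj <- roc(response = test$Species == "virginica", predictor = prob)
plot(roc_obj)
auc(roc_obj)
```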

 

The model uses part of the dataset as input to a training algorithm and applies the results to the rest of the sample. How do we know whether our results are any good? Following the BoE example, we calculate the Receiver Operating Characteristic (ROC) curve which plots the true positive rate against the false positive rate (see chart below). The former is high and the latter low, suggesting that the model performs well in determining which class of iris the data correspond to. This is confirmed by a cross-check against the area under the curve (AUC), which comes out at 98% (the higher the value, the better the fit).

Whilst this is a very simple example it does show the power of ML techniques. From an economist’s point of view their use as a statistical technique for classifying patterns makes them an extremely powerful new tool. But we should beware of overdoing the hype when it comes to their use in some other areas. Many of the more general problems in cognition are not classification problems that MLAs are good at solving. Moreover, they tend to be data hungry whereas a human can learn abstract relationships with much less data input. One thing that humans still do better than machines is adapt, and machines are not going to replace us any time soon. But for statisticians they are likely to be a boon.


[1] The others are semi-supervised learning, in which only some of the data are labelled and the MLA is still subject to manual correction. An active learning approach allows the MLA to query an external information source for labels to aid in identification. Unsupervised learning forces the MLA to find structure in the input data without recourse to labels and without any correction from a supervisor. Finally, reinforcement learning is an autonomous, self-teaching MLA that learns by trial and error with the aim of achieving the best outcome. It has been likened to learning to ride a bicycle, in which early efforts often involve falling off but the fine-tuning of actions which gradually eliminates mistakes eventually leads to success.

Sunday, 6 December 2020

Five to midnight


The game of chicken continues …
 

It is eight years since I responded to a query from the FT asking whether we should worry that the UK might leave the EU in the years ahead. My response was that we ought to worry very much about “Brixit” (as we were then calling it) “as it is a critical issue, the consequences of which need to be thought through very carefully.” I concluded with a phrase that I have continued to use ever since that “the EU is far from perfect, but life on the outside may be even harder.” Eight years on and Brexit proponents have still not thought through the consequences and there is no recognition from them that life on the outside will be harder.

We are now 25 days from year-end, the point at which the UK’s trading arrangements with the EU will change irrevocably, and there is still no trade deal in place. I am in no position to say whether the latest impasse in trade talks is the fault of the British government or, as it claims, is the result of unreasonable demands by the French. On the basis of past performance – a highly confrontational approach to negotiations from a government whose politicians are not known for being straight on Brexit matters – we would be wise not to take the British view at face value. That said, EU governments have no interest in making life easy for the UK – something that has been obvious since the very start. The likes of John Redwood MP who said in 2016 that “the UK holds all the cards” have always had a deluded view of how difficult negotiations would be.

For all that I maintain the British government is largely at fault for the shambolic state into which negotiations have sunk, we should not overlook the EU’s desire to show that it can also play hardball. It wants to demonstrate that the 27 member states will determine the conditions of access to the EU domestic market. After all, one of the essential truths of trade negotiations is that might is right. It is also possible to imagine that the EU wants to cut the UK down to size by forcing a no-deal Brexit at the end of December so that it is forced to come back to the negotiating table in January in a far weaker position. I did point out in 2019 that Boris Johnson was the last person to whom the EU wants to offer any concessions given that he is perceived to be one of the figureheads of the Brexit movement. If ever there was a way to demonstrate to him that it is not possible to have one’s cake and eat it, a hard Brexit would be an ideal lesson.

But if the UK does leave without a trade deal at the end of December, this is unlikely to be a permanent state of affairs. In late-2018 when attempting to quantify the costs of a no-deal Brexit, I looked at a scenario in which the economic effects were so nasty that the UK capitulates almost immediately – in effect, a one-period no-deal Brexit. The initial shock in this instance would be quite traumatic and over a two-year horizon output would remain more than 1% below baseline levels before starting to recover more rapidly after three years (chart below). However, back then we were looking for the trade deal to be more comprehensive than the one the government is currently pursuing. Given that the deal currently in negotiation provides relatively little protection for many sectors of the economy, we do have to wonder whether the outcome of a skinny Brexit deal will be significantly better than a default to WTO trading arrangements.

… and the blame game intensifies 

It comes as no surprise that as the clock ticks down and the “easiest trade deal in human history” remains elusive, so both sides of the Brexit divide are blaming each other for the position in which we now find ourselves. Peter Mandelson recently wrote in The Guardian that as a Remainer he played his part in the disaster that subsequently unfolded by trying “to reverse the referendum decision rather than achieve the least damaging form of Brexit.” Mandelson can speak for himself but he does not speak for most Remainers who argued repeatedly that the government should accept the least damaging form of Brexit rather than fight old battles. Indeed that is the position I have adopted on these blog pages over the past four years.

But Mandelson’s admission has allowed Brexit supporters to disingenuously claim that it was the Remainers’ fault all along – a view which is now trending heavily on social media. However, let us remind ourselves of what has happened since mid-2016. Was it the Remainers who took a non-binding referendum and treated the narrow victory as a mandate to impose a winner-take-all outcome? By not trying to unify the country before triggering Article 50 and not setting out a vision of what it wanted from the process, the government has failed to provide leadership all the way along. As I have long pointed out, Theresa May made a huge mistake by prioritising the unity of her party over that of the country and it is this which has led directly to many of the issues we face today.

First of all, her government proposed eliminating any parliamentary oversight of the Brexit process. It required the intervention of Gina Miller and colleagues to prevent the biggest power grab in modern history. This provoked a lot of sound and fury from the Daily Mail and set the tone for what followed. But to have allowed the government to implement its Henry VIII clause unchallenged would have been profoundly undemocratic. The subsequent parliamentary debates were to a large extent unedifying but given the narrowness of the referendum result, it is only right that they took place. MPs work on behalf of the public and the voices of the 48% needed to be heard, even if many did not like what they had to say.

However, the biggest problem was the decision to leave the single market which led directly to the current wrangling over the form of trade agreement. Not only was this not on the ballot paper in 2016 but voters were explicitly promised by the likes of Daniel Hannan that “absolutely nobody is suggesting that we give up our position” in the single market. May then compounded her errors by holding an election which eliminated her parliamentary majority, rather than focusing on the Brexit task at hand. There was no doubt she should have held the election before triggering Article 50. Much of the discontent thus flows from the mistakes made in the nine months after the referendum. Sure, Remainers were angry but they felt that their voices were not being heard as Leavers pressed on without showing any willingness to compromise. The “you lost, get over it” mantra was no way to address the many legitimate issues being raised.

Boris Johnson has added to the difficulties. It was his government that decided not to extend the Brexit transition process in H1 2020, despite the fact that the corona pandemic would have given him adequate cover to do so. Having drawn that red line, he then promoted the Internal Market Bill which, by the government’s own admission, breaks international law and which complicated dealings with the rest of the EU. And the fact that he operates a government with a parliamentary majority of 80 means he has not faced any domestic opposition on Brexit-related matters over the last 12 months. The fact we are where we are comes down to a litany of government errors, compounded by a lack of political judgement and leadership.

Undoubtedly there will be people who read this and say “but it’s not about economics, it’s about the right to self-determination.” More than 40 years ago the people of Iran decided that they wanted to exercise self-determination by ridding themselves of a regime that made the country a vassal of the US. They ended up with a theocracy which then proceeded to repress and impoverish the people. The UK is not Iran but the principle of “be careful what you wish for” still applies. There is still time for the UK to avoid the worst case Brexit outcome but it requires a degree of leadership that has been sadly lacking for a long time. I, for one, have no idea how this ends. But if it goes wrong, we know where the blame lies.

Friday, 27 November 2020

Happy to do my bit

This week marked another of those set piece UK fiscal events that are so beloved of politicians, journalists and a large number of economists with the publication of the government’s Spending Review. The media focus was on the OBR’s forecasts which highlighted that the UK is set to experience its worst drop in annual output since the Great Frost of 1709 (around 11%) and the biggest fiscal deficit since 1944-45. Neither of these comes as a surprise to those who have been following the UK economy in 2020 and the macro picture painted by the OBR for 2020 and 2021 largely accords with my own, so I found it rather difficult to get excited about the big picture.

That did not stop TV news editors and newspaper journalists from focusing on lurid headlines demonstrating the impact of Covid-19 on UK economic prospects. But in my view, the narrative around the outlook was more interesting. In this context I was particularly intrigued by comments from the BBC’s chief political correspondent suggesting that public borrowing is at “absolutely eye-wateringly enormous” levels and that with regard to the 60-year high in public debt: “This is the credit card, the national mortgage, everything absolutely maxxed out.”

This was yet another example of the failure to grasp some of the basic issues of fiscal policy – an issue I touched on here. It was particularly interesting to hear these comments on a day when the Economic Statistics Centre of Excellence (ESCoE) issued a report highlighting the general public’s lack of understanding of economic issues. The report, which was based on direct surveys of the public, found that they “could give broad definitions and speak in broad terms about economic concepts. However, when they were asked to provide more detailed explanations, they were generally unable to do so, and had typically never considered factors beyond their ‘personal economy’.”

When it comes to issues of dealing with public debt, this distinction is crucial. A household has a finite life and has to repay its debt over its lifetime but a government has a much longer lifespan (if not infinite, then certainly over many generations). Accordingly there is no rush to repay debt so long as there are institutional investors willing to hold it in the form of government bonds. Indeed for all the talk of “paying down debt”, UK debt levels have fallen in only 22 of the last 100 years and the declines in those years were small relative to the increases recorded in the years when debt rose. But between 1945 and 2000 it fell sharply relative to GDP, from 250% to 30%, implying that the costs of carrying the debt fell relative to income. As I noted in this post, the economic conditions for debt solvency imply that so long as the rate of GDP growth exceeds the interest rate paid on debt, the ratio of debt-to-GDP will decline (other things being equal). This means that governments should worry less about paying down the debt than ensuring that the amount of debt relative to GDP can be reduced.
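To see the arithmetic, a simple back-of-the-envelope simulation helps. Under the standard debt dynamics identity, and assuming a balanced primary budget, the debt ratio evolves roughly as b(t+1) = b(t)·(1+r)/(1+g), so it falls whenever nominal growth g exceeds the effective interest rate r. The figures in the sketch below are purely illustrative.

```r
# Illustrative debt dynamics: debt-to-GDP ratio with a balanced primary
# budget, where r is the effective interest rate and g nominal GDP growth.
debt_ratio_path <- function(b0, r, g, years) {
  b <- numeric(years + 1)
  b[1] <- b0
  for (t in 1:years) b[t + 1] <- b[t] * (1 + r) / (1 + g)
  b
}

# Start at 100% of GDP with a 1% interest rate and 4% nominal growth:
# the ratio drifts down without any debt ever being "paid off"
round(debt_ratio_path(b0 = 1.00, r = 0.01, g = 0.04, years = 25), 3)
```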

The non-specialist media made great play of the “eye-wateringly enormous” £2.3 trillion debt level. But what does this mean? As it happens, the UK national debt has risen to just over 100% of GDP, which is a 60-year high. However the last time the debt ratio was at similar levels, the outstanding debt was a mere £29.6 billion and when it was at an all-time high of 259% of GDP in 1946 it amounted to £25 billion. To put that into context, the UK recorded a fiscal deficit of £47.9 bn in April 2020 alone: in absolute terms, the April deficit was almost twice the entire stock of debt which the UK had racked up by the end of the most expensive conflict in its history. Of course such comparisons are meaningless for they take no account of inflation but they illustrate the futility of trying to grasp what a £2.3 trillion figure means. It is perhaps not surprising that voters struggle to understand basic fiscal concepts.

Since the survey indicates that people can relate economics to their own experience, consider this thought experiment. Imagine that a household has a gross income of £50,000 and borrows £200,000 to fund a mortgage. To be sure, £200,000 is a tiny fraction of £2.3 trillion but it represents four times the household’s annual income compared to one times the economy’s total income (or three times the government’s annual tax receipts). Who is more indebted? Moreover, the household has to pay back its borrowing over a horizon of 25 years but the government can simply issue more interest-bearing IOUs in order that it can roll over its debt. Who has the more onerous debt repayment schedule? And it is not just the UK consumer who struggles. The German government has convinced its electorate that it also should pay down debt, with the consequence that many German economists bemoan the lack of investment in infrastructure in recent years.

I have made the point previously that the media has a big role to play in educating the public in the use of economic statistics and holding those politicians who misuse them to account. The most egregious misrepresentation of recent years was the claim by the Leave campaign that the UK could save £350 million per week by leaving the EU. When Boris Johnson repeated this claim in 2017 the head of the UK Statistics Authority called him out. However it was known to be a lie in 2016 before the referendum, and the popular press did nowhere near enough to call it out for what it was. The comments by the BBC’s political correspondent, referred to above, were not (I assume) motivated by a deliberate intention to mislead but they nonetheless painted a false picture of an economic problem that affects all taxpayers. In that sense, the media can be said to be acting irresponsibly.

However the economics profession does not get off scot free either. Economists are often not good at explaining economic concepts in ways which relate to the everyday lives of most people. As the ESCoE report points out, “people are deeply interested in the economy and economic issues … However, at the same time, … they felt economics was difficult to engage with properly for the average person, and felt it was communicated in an inaccessible way, describing the economy as ‘confusing’, ‘complicated’ and ‘difficult to understand’”. In a speech given a couple of years ago, BoE chief economist Andy Haldane recounted an anecdote in which an academic tried to explain to an audience why leaving the EU would be bad for UK GDP. To quote Haldane: “A woman rose from the audience and, with finger pointed, uttered the memorable line: ‘That’s your bloody GDP, not ours!’” Indeed the ESCoE report suggested that “GDP was seen as economic jargon, contributing to the feeling that economics was largely inaccessible to them.”

This does suggest that there is a growing divergence between policymakers and the small group who understand the message they are trying to convey, and a sizeable majority of voters who do not. It certainly goes a long way towards explaining why the rational economic arguments against Brexit found so little resonance. The key lesson from all this is that the public needs to be more engaged with those economic issues which affect them in order that they can make more informed decisions. This in turn requires more effort across the spectrum in order to get the message across, and my first wish would be for the media to tone down the hyperbole when discussing matters such as fiscal issues. Economists have a duty to make some of the concepts more accessible as well, otherwise much of what we say will simply go over the heads of those who should be taking notice. On that front, I am more than happy to do my bit.

Friday, 20 November 2020

Union bashing

There has perhaps never been a point in the last 300 years when the union between England and Scotland has looked so strained and there is increased speculation that Boris Johnson will be the prime minister who presides over a breakup of the Union. From differences over Brexit to concerns about how the pandemic is being handled, there are many issues over which governments in London and Edinburgh are not seeing eye-to-eye. Some of these issues may, however, be overdone by an excitable media.

Scottish rumblings about home rule have existed since Victorian times but it was only in the 1970s that the issue started to make its presence felt in a big way. The discovery of oil in the North Sea off Scottish shores gave impetus to calls for independence and the failure to bring about a Scottish Assembly in a 1979 referendum, due to a wrinkle in the law, was rectified following a second plebiscite in 1997. A further referendum was held in 2014 on whether Scotland should secede from the UK altogether, which failed by a margin of 55% to 45%. But in the last six years much has changed and calls for a second independence referendum are gathering pace. Like all good macro topics, the politics and the economics are intertwined and I take a look at some of the issues here.

The politics are unfavourable … 

The Scottish National Party, which has formed the government north of the border since 2007, has enshrined the goal of independence in its party creed. But it believes in doing so as part of a consensual process in which the Westminster government collaborates. David Cameron’s government granted the SNP its wish for an independence referendum which took place in September 2014 but failed to get over the line. Having failed in its objective six years ago, the SNP has tried to reopen the question, but its attempts have so far fallen on deaf ears in London. Indeed, when the UK government sets itself against a second referendum, as Boris Johnson’s has done, it becomes virtually impossible for the Scottish government to achieve its goal of independence in the manner that it would like.

Recent newspaper headlines have focused on the fact that those supporting independence are now ahead in the opinion polls, with a survey released earlier this month giving them an 11 percentage point lead over the unionists. But as chart 1 illustrates it is only in recent months that pro-independence supporters have started to build up a lead. With around 9% of voters apparently undecided, it is far too soon to conclude that the independence camp has built up an unstoppable head of steam. A big contributory factor to recent trends is the increased opposition to Boris Johnson’s government. Ever since the days of Margaret Thatcher, attitudes towards the Conservatives in Scotland have ranged from lukewarm to outright hostile. Given the Scots’ opposition to Brexit and the fact that the prime minister is its public face, Johnson is not exactly Scotland’s favourite Englishman to start with. His handling of the Covid pandemic has made the situation worse, with a recent poll giving him a net approval rating in Scotland of -43 compared to +61 for First Minister Nicola Sturgeon.

As it is, the Conservatives have only six MPs in Scotland which has stoked resentment that they do not have a mandate to take executive decisions north of the border. But the fact that they captured 345 of their 365 seats in England at the last election means that on a nationwide basis it is very difficult to prevent them from assuming overall parliamentary control. Latest comments from Johnson suggesting that Scottish devolution has been a “disaster” have further added to his unpopularity north of the border and fuelled concerns that whilst Scottish nationalists may form a government in Edinburgh, English nationalists do so in London. The upshot of all this is that it is difficult to disentangle dislike of Johnson from genuine support for independence. If the Brexit referendum of 2016 taught us anything, it is that we have to separate opposition to the government from the issue at hand. For this reason a referendum in the near-term may not be a good idea as it risks conflating a number of issues. 

… the economics even less so 

In many ways the economics of an independent Scotland are even less favourable than they were in 2014 at the time of the last referendum. My view at that time was that the pro-independence lobby significantly understated the economic costs of going it alone. Nothing that has happened in the last six years has changed my view. The Scottish government’s assumption was that the revenue from oil resources would provide a significant safety cushion. Without going through all the calculations here, my 2014 analysis concluded that Scotland had 40-50 years of viable reserves. But since the costs of extraction will rise as the most easily accessible reserves are exhausted, a lot of what is in the ground may not be worth recovering, particularly if oil prices remain low. Six years ago the government assumed a long-term oil price of $100/barrel; oil is currently trading at just above $40 and has averaged $55 since 2015. The switch away from fossil fuels will likely put further downward pressure on prices, suggesting that Scotland may not be able to reap the benefits of its oil reserves in the way its government hoped.

Such revenue shortfalls will have significant fiscal implications. In fiscal year 2019-20, the UK recorded a fiscal deficit equivalent to 2.5% of GDP whereas Scotland reported a figure of 8.6% (including Scotland’s share of oil revenues – chart 3). Whilst revenues are lower than the UK average, the real problem is that Scottish spending levels are far higher than the national average, accounting for 46% of GDP versus a UK average slightly below 40%.

This does not mean that Scotland will necessarily be unable to fund itself but it may not be able to maintain its spending pledges without significantly raising taxes elsewhere. Since an independent Scotland would be a smaller and more open economy than the rest of the UK, it would have to rely more heavily on taxing immobile factors such as property. It is beyond the scope of this post to cover all of the other economic issues but an independent Scotland would also have to figure out how to meet its share of fiscal liabilities accrued whilst it was part of the UK and would also need to decide on its monetary arrangements. Suffice to say there are a whole lot of economic questions which require answers. And whilst many Scots are keen to see their country re-join the EU, their application would face significant opposition from Spain which has no wish to set a precedent by allowing regions which secede from their former country, such as Catalonia, to join the bloc. 

And yet … 

Whilst there are significant economic costs associated with Scottish independence which have not been fully thought through, the idea of controlling one’s own destiny is an increasingly attractive one at a time when many Scottish voters feel they are being ignored by a government whose ambitions do not coincide with theirs. My position thus remains the same as in 2014, which is that an independent Scotland will fare rather better than the unionists believe but will endure considerably more pain than the secessionists are prepared to admit. With Labour leader Keir Starmer refusing to rule out a second referendum, my long-standing prediction that the Scottish position could echo that of Canada, where a second referendum on Quebec was held 15 years after the first, could yet come about.