Turning claims into money without the related risk and expense

Debenhams, Jaeger, Laura Ashley and TopShop are just a few of the high street names that have gone – or are on the brink of going – into administration in the UK following a trading year obliterated by COVID-19. The who’s who of doomed retailers makes for stark reading, and similar stories can be found in many economies around the world. There is no denying that while the pandemic has hit many sectors hard, high street retailers have suffered significantly, with the prospect of survival, let alone growth, looking bleak.

For all businesses weathering the storm of the pandemic, minimising costs and realising value are more important than ever. If ever there was a time for lateral thinking and alternative solutions, it is now. To put the wider landscape in perspective, the latest Business Impact of Coronavirus Survey (BICS) conducted by the ONS found that 64 percent of the UK’s six million businesses are currently at risk of insolvency, with 43 percent of companies running on less than six months of cash reserves. It will come as no surprise that industries such as retail and hospitality are at particular risk.

The survey also revealed that 14 percent of all UK businesses have already halted trading as a result of local lockdown restrictions. It is in this vein that the Business Secretary, Alok Sharma, has, under the Corporate Insolvency and Governance Act 2020, further extended the easing of insolvency rules until March 2021. These rules govern whether company directors can keep trading when there is no reasonable prospect of the company avoiding insolvency; if there is no such prospect, the Insolvency Act 1986 requires them to cease trading. The rules were originally relaxed in March 2020 to help troubled firms deal with the financial shock of the pandemic and plot a course through the crisis.


The commodity of time
Businesses can take advantage of the breathing space this relaxation affords to try to avoid insolvency, which inevitably leads to redundancies and knock-on effects for suppliers and business partners. Thinking of new solutions and exploring alternative steps can buy precious time. One such alternative that has proven successful is to look at unrealised assets within the company that fall outside its ordinary course of business. These can take many forms, including one many firms will not have thought of – the pursuit of legal claims against third parties.

The prospect of litigation is often unattractive to businesses and is usually a distraction from their core operations. Litigation consumes internal resources – management time as well as cash. Disputes can take years to resolve and, even then, there are no guarantees of success. But the reality is that any commercial dealing comes with the risk of disputes, and when they do arise, they need to be dealt with. Businesses struggling in the current economic climate may still have good claims – some may even be directly related to the pandemic – where commercial partners are found to have wronged the company. It could be an unpaid debt, a breach of contract, a breach of a statutory duty or a claim for negligence against advisors. There may also be historical claims that the company has not previously had the time or resources to pursue.

The key is a mindset shift towards viewing litigation as an asset rather than a drain on resources (some disputes can be worth tens if not hundreds of millions). Identifying, bringing and prevailing in these disputes can change the game completely, and the growing sophistication of the litigation funding market has provided support to companies when it comes to ensuring that value is realised from claims. Before the pandemic, the UK insolvency litigation market had grown by 50 percent over the previous four years to be worth approximately £1.5bn per year, largely assisted by a growing uptake of third-party funding. One study has estimated that the total value of claims funded through litigation funding is around £720m per year, accounting for roughly half of all insolvency claims.

Funders can provide funding for all of the legal costs incurred in bringing a claim on a non-recourse basis, as well as covering the adverse costs risk should the claim not succeed. Funding can also be provided to meet liquidators’ fees, disbursements and other costs. A successful claim improves the financial position not only of the business but of its creditors and investors too. In the current climate, that is an important factor to weigh up. HMRC is often on the creditors list, so the wider public interest is a factor as well. But litigation funding in insolvency cases is not the answer to all problems. Not every claim will be suitable for funding and it is incumbent upon professional funders not to support meritless claims – the truth is, experienced funders will not entertain them, which makes their decision-making process a good yardstick for what can be pursued.
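To make the non-recourse mechanics concrete, here is a minimal sketch in Python of how the payoffs might fall out under a hypothetical funding agreement. The figures, the claim_outcome function and the multiple-of-costs pricing model are illustrative assumptions, not the terms of any real funder.

```python
# Illustrative, simplified model of a non-recourse litigation funding deal.
# All figures and the multiple-of-costs pricing are hypothetical assumptions.

def claim_outcome(damages_awarded: float,
                  legal_costs: float,
                  adverse_costs: float,
                  funder_multiple: float = 3.0) -> dict:
    """Return how proceeds are split between funder and claimant.

    If the claim fails (damages_awarded == 0), the funder absorbs both
    the legal costs it advanced and the adverse costs cover; the
    claimant pays nothing - that is the 'non-recourse' feature.
    """
    if damages_awarded <= 0:
        return {"funder_return": -(legal_costs + adverse_costs),
                "claimant_net": 0.0}
    # On success the funder recoups its outlay plus an agreed return,
    # modelled here as a multiple of the costs it advanced.
    funder_entitlement = min(legal_costs * funder_multiple, damages_awarded)
    return {"funder_return": funder_entitlement - legal_costs,
            "claimant_net": damages_awarded - funder_entitlement}

# A claim worth £10m with £1m of funded costs and £0.5m adverse-costs cover:
print(claim_outcome(10_000_000, 1_000_000, 500_000))  # success case
print(claim_outcome(0, 1_000_000, 500_000))           # failure case
```

In the failure case the claimant’s downside is zero, which is precisely why funded claims can be treated as an asset rather than a drain on cash.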

While governments have sought to insulate businesses from the risk of insolvency in 2020 and 2021, there is no magic wand. The stark reality remains that many businesses, particularly those in more vulnerable sectors such as retail and hospitality, will take time to recover and, until then, will remain in a precarious position. An important part of each business’s efforts to improve its solvency should be consideration of the genuine claims it may have against third parties – and how securing funding for those claims may be in its best interests.

Citizenship by Investment programmes prove popular

COVID-19 has shaped the world we currently find ourselves in. 2020 has been unsettling – a year of uncertainty and immobility. With the weight of government decisions and imposed restrictions felt across the globe, many US citizens have been forced to think about their options. Are we safe? Is the economy secure? What does our future look like? Dual citizenship and Citizenship by Investment programmes have long been a viable route for those wanting to diversify their options. The industry has seen a sharp increase in applications from US citizens this year, with companies like CS Global Partners helping people regain some of their freedom.

Micha Emmett, CEO of global legal advisory firm CS Global Partners, has not been surprised by the recent increase: “The coronavirus has hit us all hard, whether it is through personal loss of loved ones, loss of income or the general loss of freedoms we took for granted. The way various governments have handled the pandemic has differed across the globe and the impact of these decisions has been felt hard by their citizens. Economies have suffered and the full financial impact on individuals is yet to be felt. It makes sense that people have started to think about their Plan B. What are the options? How can I ensure that my family, and my business, are able to thrive?”

As citizenship experts, CS Global Partners are on the pulse of global trends, constantly monitoring spikes in applications – both where applications are coming from and where applicants are applying to. With this being a US election year, the controversy and uncertainty surrounding it, combined with coronavirus uncertainty, have naturally pushed many Americans to look into dual citizenship.

“From US citizens in particular, we have seen an increase in applications to Citizenship by Investment programmes in the Caribbean which, given its proximity to America, makes sense. However, proximity is not the only reason. St Kitts and Nevis, for example, is proving a very popular choice. The oldest and most established programme, it has stood the test of time since its inception over three decades ago. Its economy is stable (managing to keep locals financially afloat during lockdown), its government works well with its people and the lifestyle is a relative safe haven,” explains Emmett.

As a result of the impact of the pandemic on households in St Kitts and Nevis, Prime Minister Timothy Harris noted that the country’s established Poverty Alleviation Programme (PAP) – which provides low-income households with $500 per month – had supported close to 1,000 individuals who had lost income as a result of COVID-19. In fact, PAP is funded by donations to the St Kitts and Nevis Citizenship by Investment programme, which highlights the real impact and difference these investor contributions make to the country.

“The PAP beneficiaries count peaked at about 5,800 as more households lost their breadwinners and applied to the Ministry of Sustainable Development to become enrolled as a recipient of the PAP programme. Up to September 2020, my Government has paid out $23 million for the year so far in Poverty Alleviation stipends to assist the poor and vulnerable in our midst,” explained Prime Minister Harris.

Now that the peak of the pandemic has passed in St Kitts and Nevis, the country has been able to open its borders and, with health and safety protocols in place, return to a relatively normal way of life – which, in comparison with many countries across the world, puts it at an economic advantage.

US citizens wanting to expand their options can rest assured that they are legally allowed to obtain a second citizenship. In this currently immobile world, easy travel access for business is considered a sought-after benefit. Some US citizens who have chosen to live abroad and hold dual nationality have even renounced their US citizenship for tax purposes – an option available to those who want to emigrate from their homeland.

“Obtaining visas this year has been near impossible. Those with dual citizenship have definitely had an advantage. Citizenship in St Kitts and Nevis allows you visa-free access to 160 countries across the globe. You cannot over-emphasise the value that ease of travel has at this current time, with borders continually opening and closing to non-citizens. The pandemic is unlikely to be fully eradicated for some time, so the stability that dual citizenship can offer is really an investment in your, and your family’s, future,” says Paul Singh, Director at CS Global Partners.

Citizenship by Investment programmes are a relatively easy way to gain this much-needed dual citizenship. By making a donation to specific programmes in the country you are applying to, or by investing in real estate, you are able to secure your citizenship via a quick and easy process. It takes only around 60 days for the St Kitts and Nevis programme and, once you become a citizen, you are able to pass citizenship down to your children and grandchildren, securing the safety of future generations.

“The pandemic has changed the world. It has changed how we work, how we live and how we travel. Dual citizenship is now more coveted than ever and, with many reputable Citizenship by Investment programmes in place, it is an accessible option for many,” concludes Singh.

The St Kitts and Nevis citizenship programme is currently running a limited-time offer for families of up to four until January 15th, 2021. Applicants need only make an investment of US$150,000 instead of the usual US$195,000.

To learn more about St Kitts and Nevis’s limited-time offer, or for more information, please contact us: pr@csglobalpartners.com | www.csglobalpartners.com


How businesses can reduce their cost-to-serve to survive the recession

The UK has already seen thousands of job losses, with more expected, while some businesses are even closing stores to cut costs. During a recession, the mentality of many businesses and organisations shifts to survival.

However, although their minds might be on the immediate, their actions should still have the future in mind. Within their supply chains, businesses should always be looking at ways they can adapt to meet shifting customer expectations. At the same time, organisations must create the necessary agility to address demand and supply variability, while all the time trying to cut costs. It can be a hard balance to strike. Yet, for many, there is huge room for improvement within the supply chain, which can help to reduce their cost-to-serve, creating a more efficient, profitable and, most importantly, customer-centric business model in the process.


Tough times
The Great Shutdown sent the world into a historic recession – countries around the world have witnessed double-digit economic contractions. This means that businesses across the globe will now be looking to cut costs and become leaner to survive the economic turbulence.

This necessity to cut costs is vital as, during a recession, consumers are often more careful with their money. Uncertainty over job security, which is part-and-parcel of a recession, only amplifies this frugality. For businesses, this means they have to do more with less in order to operate at a profit and keep the doors open.

In this instance, businesses often look towards their labour force to cut the wage bill and towards their physical warehouses, stores and offices to reduce overheads. While this may be a quick way to reduce costs, it does mean losing valuable resources. Addressing inefficiencies within the supply chain means this doesn’t have to be the case.


A happy customer for less
When it comes to cutting costs, the supply chain is often overlooked for one reason: there is an assumption that it is a fixed cost and that increased sales naturally drive economies of scale. This couldn’t be further from the truth. Businesses looking to become leaner should address their cost-to-serve, defined as the analysis and quantification of all supply chain activities and costs necessary to fulfil customer demand.

This analysis will help businesses to highlight which customers and products are the most and least profitable within their supply chain. With this information, they can make decisions around whether it is worth continuing to provide that product or serve that specific customer and/or what needs to be done to make it more profitable. Often, businesses have the opportunity to make better use of their existing capacity by making smarter decisions. Understanding and deciding which product sells better or for a higher price in which market can ensure maximum profitability. For example, an unbranded product is likely to sell better in the UK than in France. With this knowledge, businesses can optimise their supply chain to ensure their products are going to the places where both demand and profit are highest.

In terms of cutting costs, one area within the supply chain that can lead to a high cost-to-serve and lower profit margins is transportation. Businesses looking to minimise this can evaluate their delivery routes to make them as efficient as possible. Routing can be optimised from the first to the last mile of the supply chain, accounting for both the raw materials needed to produce the product and delivery of the product itself. This also has the added bonus of enabling businesses to become more sustainable: as well as maximising profit, improving transportation efficiency will create a lower-carbon supply chain, helping businesses reach their goal of becoming net zero between 2030 and 2050.
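As a toy illustration of the kind of route evaluation described above, the sketch below orders delivery stops with a simple nearest-neighbour heuristic. Real routing engines add time windows, vehicle capacities and live traffic data; the depot and stop coordinates here are invented.

```python
# Toy nearest-neighbour heuristic for ordering delivery stops.
# Real route optimisation adds time windows, capacities and live traffic;
# the depot and stop coordinates below are invented for illustration.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_route(depot, stops):
    """Greedily visit the closest unvisited stop, starting from the depot."""
    route, current, remaining = [depot], depot, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: distance(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)  # return to depot at the end of the run
    total = sum(distance(route[i], route[i + 1]) for i in range(len(route) - 1))
    return route, total

depot = (0.0, 0.0)
stops = [(2, 3), (5, 1), (1, 7), (6, 6)]
route, total = nearest_neighbour_route(depot, stops)
print(route, round(total, 2))
```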

There are, of course, other ways in which businesses can reduce their cost-to-serve, which is why it’s such an efficient way to cut costs. The manufacturing process is another area of focus. There is often huge scope for automation in this area, which can significantly reduce the cost of production and lead to greater productivity. However, businesses must be able to identify these pain points if they are to address them. Technologies that apply AI and machine learning could help them do exactly this.


A helping hand from technology
Optimising the cost-to-serve is achieved by analysing and assessing all the supply chain activities in the network. From this point, fixed and variable costs can be allocated to each of these activities, allowing businesses to address the areas that can be improved. While many companies have the ability to do this at an aggregate level (e.g., a product category), few have a truly granular view of the costs associated with individual SKUs or customers.
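A minimal sketch of what such a granular allocation might look like in code, assuming invented activity rates and order data; a production cost-to-serve model would pull these inputs from ERP, warehouse and transport systems.

```python
# Minimal cost-to-serve allocation: variable activity costs are charged per
# order and a fixed warehouse cost is spread over total volume, giving a
# granular profit figure per customer. All rates and orders are invented.
from collections import defaultdict

ACTIVITY_RATES = {"pick_per_unit": 0.40, "delivery_per_km": 1.20,
                  "handling_per_order": 5.00}   # assumed variable rates
FIXED_WAREHOUSE_COST = 10_000.0                 # assumed fixed cost to spread

orders = [  # customer, sku, units, delivery_km, revenue
    ("acme", "SKU1", 500, 120, 2_600.0),
    ("acme", "SKU2", 50, 120, 400.0),
    ("zenco", "SKU1", 40, 15, 260.0),
]

total_units = sum(o[2] for o in orders)
profit_by_customer = defaultdict(float)
for customer, sku, units, km, revenue in orders:
    cost = (units * ACTIVITY_RATES["pick_per_unit"]
            + km * ACTIVITY_RATES["delivery_per_km"]
            + ACTIVITY_RATES["handling_per_order"]
            + FIXED_WAREHOUSE_COST * units / total_units)  # fixed-cost share
    profit_by_customer[customer] += revenue - cost

# Least profitable customers surface first - candidates for repricing.
for customer, profit in sorted(profit_by_customer.items(), key=lambda x: x[1]):
    print(f"{customer}: {'loss' if profit < 0 else 'profit'} {profit:,.2f}")
```

The same aggregation keyed by SKU instead of customer gives the product-level view described above.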

Understanding detailed cost-to-serve, by customer and by product, is a prerequisite for any organisation looking to manage its profitability. Reducing organisational costs and improving delivery efficiency, as well as ‘right-sizing’ the service offering for a particular product, customer or channel, is an imperative. Organisations can model routes to market and service strategies based on this segmentation, and understand the impact on their individual cost-to-serve.

The ultimate goal is to increase company profitability by making unprofitable customers profitable or helping profitable customers become even more profitable. But, without accurate cost-to-serve data for current customers and products, this is a near-impossible task.

For this to be a reality, organisations need an end-to-end view of their supply chain. Using digital twin technology, businesses can digitally replicate their supply chain, analysing and optimising its efficiency in the process.


Finding a solution
By running this analysis, businesses will be able to quickly identify which products and customers are less profitable than others and the reasons why. With this information to hand, they can more readily address the problem, whether it is an inefficient delivery process or excessive manufacturing costs.

Using the same technology, businesses can then work towards a solution, which will help them become more efficient. The digital twin allows them to test-drive alternative strategies, before committing to them in the real world. This way, they can continue to try different options, at no cost, until they find the most efficient solution.

To achieve maximum efficiency, businesses can then look to forecasting technology, which will help them to predict oscillations in demand, even during the heightened disruptions caused by COVID-19. With recessions often causing a downturn in demand, it’s important that businesses are not creating excess supply, which will cost them money without a return on their investment. Forecasting technology will help them avoid this pitfall.
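As a hedged illustration of the forecasting idea, the snippet below applies simple exponential smoothing to an invented demand series. Commercial demand-planning tools use far richer models (seasonality, causal drivers, machine learning), but the principle of damping demand oscillations rather than chasing them is the same.

```python
# Simple exponential smoothing over an invented monthly demand series.
# Commercial demand-planning tools use richer models; this only
# illustrates smoothing oscillations to avoid over-supplying.

def exponential_smoothing(series, alpha=0.3):
    """Return one-step-ahead forecasts; alpha controls responsiveness."""
    forecast = series[0]
    forecasts = [forecast]
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

demand = [120, 95, 130, 80, 140, 60, 110]  # invented, oscillating demand
print([round(f, 1) for f in exponential_smoothing(demand)])
```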

In today’s ‘never normal’ world – whether it is recessions, pandemics or geopolitical tensions such as trade wars – businesses that want to stand the test of time must be prepared for all eventualities. While it may represent an initial cost, implementing technology that can consistently optimise your cost-to-serve will ensure that, no matter the economic climate, your business has the best chance of maximising its profit. That should be the benchmark of any successful business.

Securitisation – the antidote for non-performing loans

Without doubt, the collateral fallout from COVID-19 will usher in a new era for the global non-performing loan (NPL) market. Not only will there be the inevitable surge in NPL volumes precipitated by COVID-19’s impact on the economy, but these new volumes will add to the stock of NPLs residing on bank balance sheets as a hangover from the global financial crisis (GFC).

Indeed, as banks commence the unenviable task of picking through their loan books and identifying the NPLs they must offload, they will also be cognisant of doing so in a highly efficient manner that maximises returns. In terms of process, although the prime candidate will be the hugely successful competitive auction processes that have become an intrinsic part of the NPL market, in practice we are likely to witness securitisation step up to the plate and assume a critical role in alleviating the banks’ pain.

Conceptually, the application of securitisation technology is the perfect medicine for the cleansing of bank balance sheets. In essence, these structures involve a bank selling a portfolio of NPLs to a special purpose vehicle that funds such an acquisition by issuing debt securities into the capital markets. The vehicle will in turn appoint a servicing entity that will manage the underlying loans on a daily basis with a fee structure that incentivises them to maximise recoveries.
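The cash flow mechanics can be sketched as a simple payment waterfall. The tranche sizes, fee structure and recovery figures below are invented for illustration, and a real NPL securitisation adds reserve accounts, triggers and interest mechanics omitted here.

```python
# Simplified NPL securitisation waterfall for one collection period.
# Tranche sizes, fees and recoveries are invented; real deals add reserve
# accounts, performance triggers and interest mechanics omitted here.

def run_waterfall(recoveries: float,
                  servicer_base_fee: float,
                  servicer_incentive_pct: float,
                  tranches: list) -> dict:
    """Pay the servicer first, then repay note tranches in seniority order."""
    paid = {}
    # The servicer is paid first; the incentive component ties its reward
    # to the recoveries it actually achieves on the underlying loans.
    servicer_fee = servicer_base_fee + servicer_incentive_pct * recoveries
    paid["servicer"] = min(servicer_fee, recoveries)
    cash = recoveries - paid["servicer"]
    for name, outstanding in tranches:          # senior tranches paid first
        payment = min(cash, outstanding)
        paid[name] = payment
        cash -= payment
    paid["residual_to_equity"] = cash           # anything left goes to equity
    return paid

print(run_waterfall(
    recoveries=30_000_000,
    servicer_base_fee=500_000,
    servicer_incentive_pct=0.05,                # 5% of recoveries as incentive
    tranches=[("senior_notes", 25_000_000), ("mezzanine_notes", 6_000_000)],
))
```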

The use of securitisation makes a lot of sense. This technology has the capacity to enable a significant volume of NPLs to be removed from the banks in one fell swoop. Given that the only limitation on sizing a transaction is the magnitude of the universe of investors that can competitively price and absorb an issuance, we could be talking about pretty hefty deals. The opportunity afforded by securitisation – offloading NPLs in either one large deal or a series of large transactions – is infinitely more appealing than the protracted period of auction processes that we have witnessed to date.

Securitisation technology also counteracts one of the major stumbling blocks that has traditionally made banks reticent about offloading NPLs: pricing. Although NPL securitisation cannot guarantee decent pricing, it does possess a number of features that load the dice in favour of the banks when it comes to achieving the best possible return.

Given this bounty of benefits, it is hard to see why securitisation cannot play an instrumental role in mopping up bank balance sheets. Indeed, this is not a new concept: there is precedent in the United States, where in the late 1980s and early 1990s securitisation technology played a key role in enabling the Resolution Trust Corporation to liquidate assets once owned by failed savings and loan associations.

Similarly, had securitisation not been perceived as one of the main culprits of the GFC, it would without doubt have been the perfect candidate to clean up NPLs in the wake of that crisis.

Ten years on, it can now be said that securitisation is a very different beast. Through the actions of investors, regulators and market participants, securitisation structures have now been finessed and structural shortcomings fixed. Furthermore, the recent Securitisation Regulation has encouraged and incentivised securitisation structures to be simple, transparent and standardised.

In summation, given the hugely positive attributes of an NPL securitisation, coupled with the fact that this technology is now ‘fit for purpose’, the requisite fertile conditions currently exist for these structures to be deployed at scale to offload NPLs. Indeed, the fact that the governments of Italy and Greece in recent years turned to securitisation for “GACS” (“Garanzia Cartolarizzazione Sofferenze”) and “HAPS” (“Hellenic Asset Protection Scheme”) respectively could in itself be construed as a massive endorsement of the role that this technology can play.

Ultimately, since these structures efficiently enable huge volumes of NPLs to be removed from the banks, which in turn enables banks to resolve their NPL problem on a more timely basis, securitisation should truly be considered the NPL antidote. Banks that choose not to embrace it do so at their peril.

What would a Biden term mean for oil?

Joe Biden has emerged with a commanding lead in the US polls and his presidency could have far-reaching consequences, not just for the oil and gas industry, but for the energy sector as a whole. The significance should not be understated: Biden’s proposals are set to bring about the most significant changes to the US offshore industry in its history.

If the election were held today, polls suggest Biden would beat the incumbent Republican president Donald Trump in key battleground states. On energy policy, the two candidates appear worlds apart. Trump, who proclaimed an era of American “energy dominance” in 2016, has since taken the US out of the Paris climate accord. On the domestic front, he has been equally aggressive in deregulating the environmental and energy sectors, pursuing the rollback of hundreds of rules in areas such as fracking and methane. Fifteen states are also now suing the Trump administration for opening Alaska’s Coastal Plain up to oil and gas leasing in 2017, which they regard as a violation of environmental laws. The difficulty for Alaska is that nearly 85 percent of the state budget is dependent on oil revenues.

In contrast, Biden has earmarked $2tn in green energy spending for his first term. The investments dovetail with his economic plan to create jobs in manufacturing “green energy” products and focus on climate policy to drag the economy out of its pandemic-era recession. Some have suggested this will mark the beginning of an ‘offensive’ on the fossil fuels industry. In normal times, a move like this would be viewed as self-destructive during an American election, with so many states dependent on the oil industry for jobs. Yet, these are not normal times and it is made possible by a number of factors. The devastating pandemic has delivered a costly blow to the economy, the oil and gas industry, and the viability of new projects. A Green New Deal could be appealing as a stimulus package at a time when clean-energy costs are falling drastically and technological progress is advancing very quickly.

It puts the energy sector at the centre of the election. Critics argue that Biden’s plans lack clarity, as he is all too aware of the political gambles involved with climate change, including possible local industry losses and the threat to jobs. Upstream producers have also spent billions on exploration and development projects; the value of these ventures could be wiped out, and litigation is likely. Despite these concerns, Biden and the Democrats have been quick to point out that these climate policies can be a source of job creation during a period when the US is looking to recover from the impact of COVID-19.

Pledges from the Biden campaign include a target of net-zero emissions by 2050 and to “decarbonise” the US electricity sector by 2035. This would involve installing a vast network of charging points for new cars and electrifying the US’s transportation sector. Crucially, deploying utility-scale battery storage across the US would enable power system operators and utilities to store energy for later use. This would be a key component in maximising the benefits of installing thousands of wind turbines and millions of solar panels, and of the plans to double offshore wind capacity by 2030.

While worries in the oil and gas industry grow and the climate debate between the two candidates becomes increasingly polarised, Biden’s plans include a policy that may alleviate some industry concerns. This involves the extremely expensive process of carbon capture and storage, which is currently attracting huge amounts of climate-related investment. These machines serve a single and simple purpose: to remove carbon dioxide from the air through direct air capture. Biden’s camp has committed to research in the technology, not only as a mitigation tool for climate change, but because carbon capture may prove a useful campaigning tool. It would guarantee the oil and gas industry’s role in the US and help Biden avoid trickier questions on fracking in regions where Trump may easily collect votes. As things stand, Biden’s plans do not include any bans on practices like fracking, but he need not worry if his policies frighten off investors.

It may not come down to voters. Even if energy policy is overlooked, investors have a watchful eye on the White House and will be aware of the changing tides. Individual states have also been driving their own agendas for some time. California, for example, is the US leader in solar power, with 18 percent of its electricity generated from solar in 2019. Whatever the outcome of this election, greater consensus driven by corporates, investors, campaigners and the wider population may result in stronger action, even while the policy prescriptions are still heavily debated.

This US election offers two extremely contrasting views on the future of energy, and the eyes of the world will certainly be watching.


Are Presidential campaigns worth the vast expense?

Campaigning to be elected president of the US is an expensive undertaking. During the 2012 presidential race, Barack Obama and Mitt Romney spent a combined sum of nearly $1.12bn, according to the Center for Responsive Politics.

Although US elections have almost always been a costly affair, the costs have only spiralled over time. Between Abraham Lincoln’s 1860 campaign and Donald Trump’s in 2016, the amount spent to be elected president increased more than 250-fold, even when the numbers are adjusted for inflation.

There’s an obvious reason why candidates feel driven to outspend their opponents. Throughout history, the majority of winning presidential candidates have been those who spent the most on the campaign trail. Throwing money at the election, therefore, seems like the logical conclusion. As Mark Hanna, a US Senator, once said, “There are two things that are important in politics. One is money and I can’t remember what the other one is.”

But just because there is a correlation between campaign spending and winning the election, that doesn’t mean that money is the deciding factor. In fact, there is an ongoing debate among political scientists over whether campaign spending meaningfully affects election outcomes at all.


Money well spent
In September, it was reported that Democratic presidential nominee Joe Biden was vastly outspending Trump on the campaign trail. According to campaign officials, as the candidates enter the final stretch of the race, Biden has $141m more left in the bank than his rival. If money determined the election result, this would suggest Biden is on the path to victory.

But while being the bigger spender increases a presidential candidate’s chance of winning, it’s no guarantee of success. There are plenty of examples of electoral candidates who have spent big and failed to win the vote.

In some cases, a higher amount may be spent to compensate for other problems. As Brian Libgober, political scientist and Assistant Professor at the University of California at San Diego, explains, sometimes the candidates with the most cash are self-funded. The fact that they have to rely on their own money in lieu of donations can reflect their own weaknesses as candidates. “Often, these are particularly wealthy self-financed candidates who can raise funds without necessarily having the qualities that make a candidate electorally strong, for example relationships with key constituencies, experience running for office, charisma, a compelling policy platform and so forth,” he said.

Michael Bloomberg, who ran for president in 2020, is one such example. Despite pouring almost $1bn of his own money into his three-month-long campaign, Bloomberg was forced to end his presidential bid after winning just one of the 15 contests up for grabs on a critical night in the Democratic primaries. His poor performance on the debate stage was a key reason why.

Another high-profile example proving that money doesn’t guarantee electoral success is Donald Trump’s 2016 victory. The $398m he spent on campaigning was almost half the amount forked out by his opponent, Hillary Clinton. As Libgober points out, clearly Trump would not have won “if campaign spending was decisive in presidential elections.”

Of course, Trump benefitted from a number of advantages his rival didn’t have: TV stardom and an anti-institution persona, with none of the baggage that accompanies a political career (or being married to a former president). But there are a number of other reasons why Trump succeeded despite being at a disadvantage financially.

‘Earned media’, or free press, is one. Trump had a huge advantage in the daily news cycle thanks to his controversial remarks. A study by The New York Times found that, overall, Trump enjoyed nearly $2bn worth of free media coverage during the campaign.

Others suggest that Trump’s campaign spending was more effective than Clinton’s. He invested more heavily in social media, whereas Clinton relied on more traditional advertising, like expensive TV ads. Also, a study by the Wesleyan Media Project found that most of Trump’s TV ads attacked Clinton’s policies, whereas most of Clinton’s went after his personality, which may have weakened her case to voters.

Libgober, however, doubts these factors would have made much of an impact. “I tend to view these claims with scepticism,” he said. “In a media environment already super-saturated with information and where partisan loyalties are strongly activated, we should not expect campaign spending to make a huge difference.”

Advertising is the cornerstone of any election campaign, usually making up the bulk of the budget. President Obama spent more than 70 percent of his campaign expenses on advertising. But, despite the huge sums put towards it, the effectiveness of political advertising is far from clear-cut.


Getting noticed
The most obvious benefit of a media campaign is name recognition. Unsurprisingly, many studies have found that people prefer candidates they recognise over those they aren’t familiar with. It goes without saying that an enormous presidential campaign tends to result in widespread recognition of the candidate.

Achieving this recognition is particularly important for newcomers. Unlike the incumbent, these challengers aren’t yet household names. Therefore, it’s worth raising large funds in order to level the playing field. In this scenario, money can have a significant impact on the race, since it can determine which candidates fall at the first hurdle.

Spending money early on in the race has also been shown to make a difference. A 2016 study found that early spending impacted who would win the primaries partly because it boosted the profiles of lesser-known candidates.

Raising public awareness is one thing. Actually persuading people to vote for a candidate is another. This is where the effects of advertising become less clear.

Some experiments have found political advertising to have only a negligible impact. One large field experiment measured the effect of TV advertising during Rick Perry’s 2006 campaign to be elected Governor of Texas. The results were surprising. Although Perry gained a 5 percent lead in the polls in the markets where the ads were played, this lasted only a week.

Potentially, that’s because a lot of voters had already made up their minds about Perry. Over the years, political partisanship in the US has increased. According to the Pew Research Center, the share of US citizens who consistently stick with their political views more than doubled between 1994 and 2014, from 10 percent to 21 percent. The prevalence of “ideological silos” makes it less likely that voters will change their minds because of an advertising campaign.

“Persuading through advertisement is harder, particularly if the customer has strong brand loyalty or already knows the product. By analogy, campaign spending is least likely to help during elections where there is already substantial media coverage and hardened public perceptions. Since the US presidential election is the most extensively covered election where party attachments are the strongest, it is exactly the kind of election where we should expect campaign spending to matter least,” said Libgober.

If advertising’s power to persuade is so uncertain, then it may be that large portions of campaign budgets are being misspent. Indeed, some suggest that elections are a case study in diminishing returns: the closer the race becomes, the more donors are prepared to spend and the less impact their money has.


Reading the dollar signs
Money may not dictate who wins the election. But it can tell us something about who is most likely to win.

Sometimes the number of donations indicates which candidate the wider population believes is strongest. “In elections at all levels, strong candidates typically attract more funding than weak candidates, so it is almost inevitable that there appears to be a relationship between campaign spending and winning,” said Libgober. “So maybe it isn’t that the candidate that spends the most typically wins because they spend the most, but rather that the candidate with the greatest electoral strengths tends to win and also spends the most.”

We can also get a sense of the probable winner by looking at how many small donors have contributed to a campaign. Small donors are very likely to vote, so the number a campaign attracts is a useful gauge of a candidate’s popularity with voters. According to the Center for Responsive Politics, Trump has raised almost $100m more than Biden from such donors.

So following the money can give us an indication of which way voters are leaning. As for whether the amount spent in the campaign will actually affect the election result, Libgober is doubtful.

“Of course, in a razor-thin election such as 2000 or 2016, even small differences can prove pivotal. At this point the polling does not suggest a razor-thin election outcome [for Trump versus Biden], although that could change. I suspect that the performance of the economy and the stock market, the direction of the pandemic, and issues around ballot access are more likely to matter than anything the campaigns do themselves,” said Libgober.

All things considered, campaign spending appears to be most effective when a candidate needs to improve their name recognition. After this, the impact of all that advertising spending becomes much harder to evaluate.

This raises questions that have a bearing not just on campaign officials, but on wider society as well. For example, if the effect that advertising spending can have on political outcomes is limited, then this may cast doubt on the influence of digital advertising behemoths like Facebook, which has come under intense scrutiny for its relatively unregulated approach to hosting political ads.

Even so, there’s no denying that money can distort the political process. It’s important, therefore, that the public knows where campaign finances are coming from. Ever since Citizens United – the 2010 case in which the Supreme Court decided that corporations and individuals could spend unlimited amounts on elections – many have been concerned about the level of influence that rich individuals and corporations can have on an election’s outcome. Alarmingly, ‘dark money’ – undisclosed donations – has been on the rise. While the effectiveness of a lot of campaign spending is unclear, more regulation and transparency around where these funds come from would surely benefit the political process.

Top 5 countries to be the world’s next manufacturing hubs

There’s a reason China has been named “the world’s factory”. According to data published by the United Nations Statistics Division, China accounted for almost 30 percent of global manufacturing output in 2018. China earned this status in a relatively short space of time. According to The Economist, in 1990, China produced less than 3 percent of global manufacturing output. It first overtook the US, previously the world’s manufacturing superpower, in 2010.

But the US-China trade war has prompted many companies to re-examine global supply chains. A recent study by the McKinsey Global Institute estimates that companies could shift a quarter of their global product sourcing to new countries in the next five years. Climate risks, cyber attacks and the ongoing pandemic are only accelerating this trend. In this uncertain trade environment, a growing number of countries are hopeful that they could replace China as the world’s next major manufacturing hub.


1 – Vietnam
So far, Vietnam has been one of the main beneficiaries of the US-China trade war, absorbing much of the manufacturing capacity that China lost. As well as cheap labour and stable politics, the country boasts increasingly liberalised trade and investment policies that make it an attractive place for businesses looking to diversify out of China. Some of the biggest names in tech have relocated some of their operations to Vietnam since tensions between the two powers soured. In early May 2020, Apple announced it would produce roughly 30 percent of its AirPods for the second quarter in Vietnam instead of China.


2 – Mexico
A lesser-known beneficiary of the trade war is Mexico. In a report, the investment bank Nomura pointed out that Mexico could become a top destination for US companies, with the country having set up six new factories in a range of sectors between April 2018 and August 2019. In addition, Taiwan-based manufacturers Foxconn and Pegatron, known as contractors for Apple, are among a number of companies currently considering shifting their operations to Mexico. Mexico’s proximity to the US poses a major advantage as US companies embrace “near-shoring”. The Trump administration is exploring financial incentives to encourage firms to move production facilities from Asia to the US, Latin America and the Caribbean.


3 – India
In recent years, India has significantly stepped up efforts to attract manufacturing investment. Prime Minister Narendra Modi’s “Make in India” initiative is designed to help the country replace China as a global manufacturing hub. A cornerstone of this plan involves encouraging the world’s biggest smartphone brands to make their products in India. In June of this year, the country launched a $6.6bn incentive programme to boost electronics manufacturing. So far, however, the country has seen only modest gains from the trade war. Analysts blame India’s stringent regulatory environment; on the Organisation for Economic Co-operation and Development’s FDI Regulatory Restrictiveness Index, India ranks 62nd out of 70 countries.


4 – Malaysia
Between 2018 and 2019, the Malaysian island of Penang saw a surge in foreign investment. Much of this came from the US, which spent $5.9bn in Malaysia in the first nine months of 2019, up from $889m the year before, according to the Malaysian Investment Development Authority. US chip maker Micron Technology announced it would spend RM1.5bn ($364.5m) over five years on a new drive assembly and test facility. However, the loss of trade from China has hit Malaysia hard. Many tech firms in Penang rely on China for as much as 60 percent of their components and materials.


5 – Singapore
Singapore’s manufacturing prowess has somewhat diminished in recent years. While manufacturing contributes about 30 percent of GDP in Taiwan and South Korea, it makes up just 19 percent of Singapore’s. However, the trade war and the coronavirus pandemic could change this. As a trade hub with liberal trade and investment policies and a history of stable economic growth, Singapore is well positioned to boost its manufacturing capabilities and capitalise on this opportunity. However, like Malaysia, Singapore is also struggling with the knock-on effects of decreased demand from China. The export-dependent country has seen its manufacturing output slump as a result of the trade war – a sign that the country could benefit from greater independence from China.

How culture can help explain economic development

In the middle of the eighteenth century, Europe experienced explosive economic growth. GDP per capita in the Netherlands – one of the richest parts of Europe at the time – was 42 percent higher than in the Yangzi delta, then the economic powerhouse of China. By 1770, that figure had reached 90 percent. In just a few decades, Europe’s wealth had surpassed that of all other regions.

The Great Divergence, as it is called, helped spawn the discipline of economics as we know it today. Adam Smith’s landmark 1776 text The Wealth of Nations sought to identify the major contributors to a nation’s wealth and sparked a long line of economic inquiry analysing how culture dictates which countries become wealthy and which do not. Many of these analyses concluded that European culture alone was conducive to economic growth; the German political economist Max Weber argued that the Protestant work ethic was responsible for Europe’s high economic output.

In the twentieth century, cultural explanations for wealth inequality between nations began to lose their popularity with economists. There were two main reasons for this. One was the rise of the ‘Asian tiger’ economies, which refuted the idea that only Western, Christian cultures could enjoy great economic success. The other was the growing prevalence of data, which gave rise to more quantitative explanations of markets, as well as economic explanations of social phenomena.

“In the 1960s to the 1970s, mainstream economists began to argue that economics could provide explanations for many phenomena in social sciences,” said Paola Sapienza, Professor of Consumer Finance at Northwestern University’s Kellogg School of Management. “The paradigm became that economics affects culture, not the other way around. For example, basic economics was used to explain family decisions, such as the participation of women in the workforce and fertility choices, ignoring cultural influence.”

Now, the pendulum has swung back once again. Today’s economists are turning to culture to answer questions about people’s financial behaviours and what shapes them.


A different perspective
Towards the end of the twentieth century, economists began to see the pitfalls of imposing economic policies without paying heed to culture. The Washington Consensus, for example – a set of neoliberal policies presented to the International Monetary Fund in 1989 – is broadly seen as having failed to achieve its goal of bringing prosperity to Latin America. In the thirty years after the Washington Consensus was implemented, Latin America grew by less than 1 percent per year in per capita terms, compared with 2.6 percent annual growth between 1960 and 1981.

This shows that, while a certain kind of institutional reform may succeed in one country, it won’t necessarily succeed in another, and culture may be the reason why. “One example is Italy,” said Thierry Verdier, Professor of Economics at Paris School of Economics, “where you had reforms that worked in the north but not in the south. Why? People have begun to think that it’s down to very long-term factors such as the development of cities in the north of Italy and the building up of social capital there that happened over a long period of time. It didn’t exist in the south because of other historical developments.”

There are many scenarios where economics alone cannot account for the behaviour of a certain group. For example, immigrants and their children often exhibit different behaviour despite being in the same economic environment as other citizens.

“Immigrant children of a certain origin systematically outperform US-born students in the country, even if they attend the same school. These differences hold after taking into account the income and the education of the parent,” said Sapienza. “If the explanation for these differences in behaviour were economic conditions or the quality of the institution, we would not observe these differences.”

By identifying the cultural beliefs that proliferate in more productive and innovative countries, we could advance our understanding of the conditions needed for economic success. “We understand that if we invest more in physical capital or in finance or in technology the economy probably will grow more. But that doesn’t necessarily explain the variety of growth across the world,” said Verdier. “To explain that, we need to go to deeper causes which relate to how a country developed.”


The traits of successful countries
Economists have linked some cultural beliefs to higher levels of economic development. One of these, inevitably, is a population’s willingness to engage in markets, whether through investment or employment. “The decision to work has economic consequences for the individual and the family but more generally for the development of the nations,” said Sapienza, “as productivity is positively affected by the share of labour participation in the economy.”

To assess this willingness to participate in markets, economists will sometimes look at the prevalence of social trust in a given community. Many studies have associated increased social trust with higher rates of trade, innovation and development in a country’s financial sector. Countries that record low levels of trust between strangers, meanwhile, tend to be less economically developed. Of course, institutions have a role to play here as well. If a country’s economic institutions are less transparent and less reliable, it follows that people would be less likely to trust them with their capital.

Studies have also revealed a correlation between the strength of family ties in a country and that nation’s economic development. Stronger family ties usually mean more family businesses. As family businesses are often less competitive and less efficient than other firms, their prevalence can have a negative impact on the economy.

A look into the past could explain why some of these traits develop in the first place. A 2019 paper by Benjamin Enke, Assistant Professor at Harvard University’s Department of Economics, proposes that pre-industrial groups with a higher occurrence of pathogens in their environment were more likely to forge close-knit family ties, because shunning outsiders was vital for reducing the risk of infection. Even as a culture evolves, deep-rooted factors like this may continue to play a role.

However, it can be hard to determine whether the level of economic development in a country is mainly down to culture or policy. In the case of the Soviet Union, low productivity wasn’t the result of a cultural trait but rather the collectivist regime that had been imposed on the population. Clearly, the economic environment itself has serious implications for a population’s financial preferences and social mobility. The same is true of a country’s physical environment. For example, a study published in the Journal of Human Development has found that landlocked countries are generally at a greater economic disadvantage.


Culture clash
Studying culture’s impact on economics is not without its complications. As the economic historian David Landes points out, one problem with discussing the pitfalls of a certain culture is that it could lead to xenophobic interpretations.

Another issue is that culture itself is difficult to define. The vagueness and breadth of the concept make it hard to draw clear conclusions about its influence on economics. “It’s not necessarily only based on objective measures,” said Verdier. “There’s a degree to which it is very much subjective, and how we measure subjectivity is an issue for economists.”

Attempts have been made to create a more precise definition for use by economists. In 2006, Sapienza and her co-authors Luigi Zingales and Luigi Guiso described culture as the “customary beliefs and values that ethnic, religious, and social groups transmit fairly unchanged from generation to generation”.

What’s more, over time, better techniques and more data have made it easier to quantitatively measure cultural traits. The World Values Survey and the General Social Survey were introduced in the 1980s to evaluate people’s values and beliefs and how these change over time.

“Once you have that information,” said Verdier, “you can relate it to information which is much less subjective, such as growth rates or poverty rates or the fact that particular countries implement regulations on labour markets this way and some do it in another way. So economists take the subjective information from these surveys and relate it to the more objective economic indicators that are more systematically collected in a very well-defined manner from the start.”


The implications for policy
When taking culture into account, it’s important to consider the way it interacts with other factors that impact economics. Verdier believes that the complex interplay between culture and institutions is crucial for understanding why and how countries develop in different ways.

“There’s one aspect that is often debated among economists,” he told World Finance, “which is whether or not the interactions between institutions and culture are complementary. Certain types of formal rules are complementary to the development and maintenance of particular beliefs. Say, for instance, that you have a discriminatory market institution like slavery. That certainly interacts in a complementary way with the beliefs of racism. And so that’s a case where the racist culture you have is complementary to the types of institutions in the country and they reinforce each other.”

But the opposite can also be true. “You may have a situation where, on the contrary, institutions and culture tend to mitigate each other in terms of their effects,” said Verdier. “For instance, a country’s population could have a strong belief in the value of work and, at the same time, welfare programmes that are maybe too generous or just distributed without any conditions. And that could generate a notion that you have rights and those rights actually depreciate the value of work, which in turn creates, of course, inefficiency in terms of the social welfare system.”

Understanding how culture and institutions reinforce or counterbalance one another can have real-world applications for the way we implement policy. Before reforming an institution, it’s important to know whether an existing cultural belief or value could potentially undermine it. “In that sense,” said Verdier, “having some knowledge passing from sociologists to economists who do this kind of work from a quantitative perspective may provide some insight on whether or not you have a framework in place for institutional reforms to make them more effective.”

For decades, economists turned their noses up at cultural explanations for economic outcomes. The economist Robert Solow said that attempts to meld the two subjects ended up in “a blaze of amateur sociology”. But no market is created in a vacuum. Today, economists are increasingly willing to recognise that the wealth of a given nation cannot be explained without acknowledging the complex interplay between many different factors, from its institutions to its cultural beliefs to its environment and its pre-modern history. “As wonderful a tool as economics is, it does not explain all behaviours. Incorporating culture among the explanations has made economics a much more powerful tool,” said Sapienza.

EU chief reveals plans for post-pandemic recovery

On 16 September, the president of the European Commission, Ursula von der Leyen, outlined plans to bring the bloc out of the deepest recession in its history while also making European nations more resilient in the future.

Von der Leyen is determined that the EU emerges from the coronavirus pandemic better prepared to face another looming crisis: climate change. Addressing EU lawmakers in her first State of the Union address, she reinforced her commitment to cutting greenhouse gas emissions, announcing that the EU should increase its emissions-cutting target to at least 55 percent by 2030, up from an existing target of 40 percent.

“I recognise that this increase from 40 to 55 is too much for some, and not enough for others,” she said. “But our impact assessment clearly shows that our economy and industry can manage this.”

She also stressed the importance of international cooperation during the pandemic, urging European governments to work together on common healthcare policies. She promised that the EU would create a biomedical research agency and convene a global health summit. “We need to build a stronger European Health Union,” she said. “And we need to strengthen our crisis preparedness and management of cross-border health threats.”

Von der Leyen’s calls for cooperation stand in contrast with the union’s apparent fragility at this time. Her State of the Union address comes amid growing antagonism between the EU and the UK over a Brexit deal. The European Commission President warned that hopes of a post-Brexit trade deal were “fading” and that UK Prime Minister Boris Johnson’s attempt to override parts of the withdrawal treaty was illegal.

Universal basic income gains support during the pandemic

In his 1516 fictional work, Utopia, the philosopher Thomas More describes a conversation between Portuguese traveller Raphael Nonsenso and the Archbishop of Canterbury, John Morton, in which the former argues that cash handouts provided by the state could reduce theft in the city of Antwerp. “No penalty on earth will stop people from stealing, if it is their only way of getting food,” Nonsenso says. “It would be far more to the point to provide everyone with some means of livelihood.”

This is thought to be the earliest written example of a concept that’s still considered radical today: universal basic income. Its advocates argue that the state should provide every individual with a regular income, regardless of factors such as their employment status, personal wealth or ability to work.

The concept of universal basic income has typically been popular among left-leaning economists and academics. Until recently, it had only been trialled in a number of small-scale pilots, including one in Finland and another in the Canadian province of Manitoba. Usually relegated to academic discussions, it was not a concept at the forefront of politicians’ minds.

But the coronavirus pandemic could change this. The crisis has led to unprecedented levels of government spending, with European nations guaranteeing wages through furlough schemes while other countries increase welfare provisions. More and more policymakers and politicians now believe that enacting unconditional universal basic income could mitigate some of the worst effects of the pandemic.

 

Desperate times, desperate measures
In March 2020, over 500 academics and public figures from around the globe signed an open letter urging governments to enact an emergency basic income during the pandemic. Jens Lerche, Reader in Agrarian and Labour Studies at SOAS University of London, was one of its signatories. He explained to World Finance his reasons for signing.

“Across the world, millions of people have lost jobs and livelihoods because of the pandemic. The relief packages put in place only provide cover for some of them. Millions must rely on charity or survive on bare minimum support,” he said. “The only simple and straightforward system that could carry everyone through the crisis is universal basic income. It could ensure that no one fell through the cracks.”

Its proponents argue that an emergency basic income could provide vital support to small businesses and the self-employed, many of whom have been neglected in governments’ financial stimulus packages during the pandemic. Some commentators have also argued that universal basic income could limit the virus’s spread by increasing social distancing. They reason that, without the need to work, vulnerable people would be less likely to put themselves at risk of contracting the disease.

Some countries are already taking the idea more seriously. On 15 June 2020, Spain – one of the hardest-hit countries at the start of the pandemic – began offering monthly payments of €1,015 ($1,145) to the nation’s poorest families. Germany announced in August that it was trialling such a system in a three-year study, giving monthly payments of €1,200 ($1,400) to 120 Germans and comparing the results with 1,380 people who do not receive the payments.

 

Post-pandemic legacy
There are many, however, who believe universal basic income should be much more than an emergency response to the pandemic. These people would like to see universal basic income permanently integrated into economic systems around the world.

A number of arguments have been put forward in favour of this. One is that universal basic income could help to correct some of the wealth inequality created under capitalism. “As the share of the wealth generated in society for the last decades has disproportionately benefited the super-rich, while many wages have stagnated, universal basic income is also a way of redressing skewed wealth distributions,” said Lerche. “In the long run, it could lead to less divided societies.”

Liz Fouksman, Leverhulme Early Career Fellow in Area Studies at the University of Oxford, argues that, nowadays, people tend to think of basic income as a social policy or as a welfare provision. But, at its heart, the idea is about the redistribution of wealth. “From its earliest conception,” she said, “basic income was seen as a way of getting people their rightful share – whether it’s their rightful share of land wealth or their rightful share of the wealth created by previous generations or their rightful share of natural resource wealth. So its roots very much lie in the realm of distributive justice. And that’s something that often gets lost in the basic income conversation, especially when it comes to conversations around policy and implementation.”

Some also believe that, with the rise of automation and artificial intelligence, universal basic income could become a necessity. “The need for people to have a basic income that is linked not only to work is likely to become even more important in future as the progress in automation will lead to a shrinking job market,” said Lerche. In fact, the economist John Maynard Keynes predicted in 1930 that the grandchildren of his generation would work only fifteen hours a week, because their material needs would be satisfied. The fifteen-hour work week hasn’t yet materialised, but many academics continue to predict the decline of the global labour force.

Some economists argue that universal basic income could even benefit the economy. A 2017 study by the left-leaning Roosevelt Institute found that giving every adult in the US $1,000 a month could grow the economy by $2.5trn by 2025. Finally, it could also significantly improve quality of life, giving people more income to spend on essentials such as food and potentially eliminating homelessness.

 

Barriers to adoption
Despite its potential benefits, there are a number of reasons why governments have been reluctant to embrace universal basic income programmes. Unsurprisingly, a big one is cost. But proponents argue that universal basic income is much more affordable than people think.

In a working paper for the United Nations Development Programme, Eduardo Ortiz-Juarez and George Gray Molina estimate that providing a temporary basic income to all people living below the poverty line would cost between $200bn and $465bn per month, depending on the specific policy. Considering this could keep 2.78 billion people out of poverty, it is a relatively modest amount. Meanwhile, the economist Karl Widerquist has found that, to fund a UBI of $12,000 per adult and $6,000 per child annually, the US would have to raise an additional $539bn a year – much less than the trillions of dollars usually forecast.
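The arithmetic behind the gap between ‘trillions’ and $539bn is worth making explicit: a UBI’s gross outlay (grant multiplied by population) is indeed several trillion dollars a year, but much of that flows straight back to the state from households whose new taxes or forgone benefits exceed their grant, so the net revenue that must be raised is far smaller. The short Python sketch below illustrates the logic; the population counts are rounded assumptions and the flat clawback rate is a hypothetical stand-in for Widerquist’s detailed tax-and-benefit modelling, chosen only so the output lands near his figure.

# Back-of-the-envelope sketch of gross vs net UBI cost. All figures are
# illustrative assumptions, not Widerquist's actual model: populations are
# rounded, and a single flat 'clawback' rate stands in for the taxes and
# benefit offsets that reclaim payments from higher-income households.

ADULTS = 250_000_000       # assumed US adult population (rounded)
CHILDREN = 75_000_000      # assumed US child population (rounded)
ADULT_GRANT = 12_000       # annual grant per adult, per the cited scenario
CHILD_GRANT = 6_000        # annual grant per child, per the cited scenario
CLAWBACK = 0.85            # hypothetical share of gross outlay recovered

gross = ADULTS * ADULT_GRANT + CHILDREN * CHILD_GRANT   # headline cost
net = gross * (1 - CLAWBACK)                            # new money needed

print(f"Gross outlay: ${gross / 1e12:.2f}trn per year")  # ~$3.45trn
print(f"Net cost:     ${net / 1e9:.0f}bn per year")       # ~$518bn

On these rough assumptions the net figure comes out within a few percent of the $539bn estimate, which is the point of the comparison: the affordability debate turns on net, not gross, cost.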

“The cost issue is easily exaggerated but clearly more work and more political attention on this aspect is required; the fact that a universal basic income is affordable must be shown again and again,” said Lerche.

The feasibility of a basic income programme also depends on the existing institutional framework of a given country. Arguably, European welfare frameworks are better suited to a universal basic income than the welfare system of the US.

However, it’s not just a question of cost, existing institutions or political willpower. Some of the barriers to adoption are psychological in nature. “I think the biggest barrier is a kind of generalised sense that people can’t get money for nothing,” said Fouksman.

Fouksman argues that this mentality is deeply ingrained within capitalist society. “Of course, once you start picking this apart, you realise very quickly that it makes no sense,” she said. “Plenty of people get money for nothing by, for instance, investing in the stock market. But we don’t critique the wealthy for multiplying their money through passive investing.”

 

A change in mentality
An accusation often levied against universal basic income is that it could make people lazy. When Finland experimented with universal basic income for two years between January 2017 and December 2018, the researchers concluded that while it may have made unemployed people happier, it did not lead to increased employment.

However, there were several problems with this study. The trial involved only 2,000 people and participants had to take cuts to other forms of government support. Both of these factors made it hard to measure the programme’s impact.

Moreover, as Fouksman points out, other studies have shown the opposite effect. “We actually know from tonnes of randomised controlled trials and pilot programmes around the world, dating back as far as the 1960s, that when people have a guaranteed income, most of the time their participation in the economy increases,” she said, “whether it’s because they are more able to get jobs because they can migrate to places that actually have jobs, or because they’re able to finish their education and apply for more skilled jobs, or because they now have the capital and the social insurance to take a risk and start their own businesses – or all of these routes.”

The idea that it would support entrepreneurship is one shared by many tech leaders. Facebook CEO Mark Zuckerberg said in his 2017 Harvard commencement speech that he could only build the company thanks to his family’s financial stability. “The greatest successes come from having the freedom to fail,” he said. “Now it’s our time to define a new social contract for our generation. We should explore ideas like universal basic income to give everyone a cushion to try new things.”

According to Zuckerberg, a universal basic income would unlock equality of opportunity. It’s this line of thinking that has led some to conclude that it should be a basic human right. “You could think about it as a way of righting the wrongs of having an unequal initial distribution of resources, or giving people a rightful share of the wealth generated by previous generations that is currently being captured unfairly or unequally. Or you could think about it as simply a fundamental human right to life – in that, in order to live, you need basic resources,” said Fouksman.

However, it could be a long time before the wider population shares this view. Perceiving a universal basic income as a fundamental right, rather than another “handout”, would require a huge shift in the belief system of capitalist society. But now could be the optimum time for that shift in perception. Moments of crisis can often serve as catalysts for socio-economic change. Already, populations have had to adjust to new modes of working and more extensive welfare provisions as a result of the crisis brought on by COVID-19. It is not inconceivable that universal basic income could become part of the legacy the coronavirus pandemic leaves behind.

Credit Suisse to launch digital banking app in October

Credit Suisse will launch a digital banking app in October, the bank announced on 10 September, posing a challenge to fintechs like Revolut and N26. Its new CSX app will offer a free-of-charge online debit card alongside other services including mortgage applications, investments and pensions.

Fintechs such as Revolut and N26 have been steadily gaining market share not just in Switzerland but around the world. Revolut, for example, has more than 350,000 clients in Switzerland and over 12 million personal customers worldwide. These fintechs have attracted such huge client bases through their cheap, easy-to-use digital offerings, which have proven particularly popular with younger generations.

So far, incumbent banks have struggled to roll out apps as popular as their digital rivals’. In 2019, JP Morgan shut down Finn, its digital banking app, after it failed to attract enough of the younger customers it was aimed at.

The new CSX app is part of Credit Suisse’s new digital strategy, aimed at streamlining the organisation and attracting younger customers. Last year, Switzerland’s second-biggest bank announced it would invest hundreds of millions of francs in digital services and cut down its branch network. The bank said this August that it was planning to shut 35 branches and merge a subsidiary as part of $110m in spending cuts. The remaining branches will be revamped to include “digital bars”, where specialists will provide advice via video conferencing, and “event zones” to attract start-ups.

The dominance of the US dollar is called into question

In July 2020, the US dollar suffered its poorest monthly performance for a decade, as the country grappled with the economic fallout of the pandemic. The currency’s tumble has raised concerns that its dominance of the global financial system could be waning. According to data from the Commodity Futures Trading Commission, hedge fund bets against the dollar in futures markets are at their highest level in about ten years. Meanwhile, Goldman Sachs currency strategists have warned that the dollar is in danger of losing its status as the world’s reserve currency.

Many economists think that concerns about the dollar’s demise are exaggerated. They argue that a number of short-term factors have contributed to its decline, including the US Federal Reserve’s aggressive monetary easing, aimed at boosting liquidity during the pandemic.

Others argue that a weaker dollar does not necessarily mean the currency is losing influence in the world. “An expensive dollar is the most significant threat to dollar dominance. It is inflationary around the world, increases credit risk, damages balance sheets and limits credit flows. A cheap dollar is everyone’s friend,” said Aaron Cantrell, Director of Economic Research at Record Currency Management.

Nevertheless, its decline in value tells us something important about the US’s changing place in the world’s financial system. While the US dollar is likely to remain the world’s reserve currency for the foreseeable future, its depreciation is a sign that the US no longer commands the global trust and confidence that it once did.

 

Having reservations
When the virus first broke out, investors and companies rushed to the dollar. “The pandemic exposed the scale of dollar dependency around the globe,” said Cantrell. “Financial institutions, corporations and governments all scrambled for dollars to cover liabilities and liquidity needs in face of the unknown.”

The rally the dollar saw at the start of the pandemic fulfilled analysts’ expectations; during times of economic uncertainty, investors often flock to safe haven currencies. But this rally was short-lived. Benjamin Cohen, the Louis G Lancaster Professor of International Political Economy at the University of California, Santa Barbara, explains that this was out of the ordinary.

“In the past, when a major crisis hit the world economy – such as the Latin American debt crisis of the 1980s, the Asian debt crisis of 1997-98, the global financial crisis of 2008-09 – the dollar served as a safe haven. Money would flow into the US – specifically, into US Treasury bonds. So under ordinary circumstances we might have expected to see the same thing today in the midst of the COVID-19 pandemic. But it hasn’t happened this time,” he told World Finance.

The dollar’s sharp decline in value speaks to vulnerabilities in the US economy. US institutions are growing weaker while politics is becoming more dysfunctional. Trump’s economic nationalism has seen the US’s role in global trade and international politics diminish. At the same time, his mismanagement of the coronavirus crisis has seriously eroded trust in the country, both at home and abroad.

“My opinion is that [the decline] is because of the pathetic policy response of the Trump administration, which for many around the world is the last straw,” said Cohen. “For three years the Trump administration has taken steps that undermine the world’s confidence in the US – and by extension, confidence in the dollar. More than ever, investors and central banks are trying to find or promote alternatives to the greenback. The dollar is no longer the default refuge in the midst of a crisis.”

Ideally, the host economy of a reserve currency should play an outsize role in global trade, serve as a global creditor and have a history of monetary stability. Together, these factors encourage partners to draw up contracts in its currency and to accumulate US dollars in their reserves. The US no longer fulfils these criteria as clearly as it once did. Its share of global trade has fallen – particularly during the US-China trade war – while alarmingly high levels of public debt undermine its record of stability. This raises doubts over whether the US dollar’s central role in global markets is still warranted.

 

The status quo
The dollar’s role as the world’s reserve currency was established in the 1944 Bretton Woods agreement, which pegged the exchange rate of all currencies to the dollar, which in turn was pegged to gold. This meant that, instead of gold, other countries accumulated reserves of US dollars.

Today, the dollar makes up about 61 percent of all known central bank reserves, according to the International Monetary Fund. In addition, roughly 40 percent of the world’s debt is denominated in dollars. This brings certain benefits to the US economy. The country issuing the reserve currency is not exposed to the same level of exchange rate risk as other countries and can also afford to borrow large sums of capital more cheaply.

In the past, the US has been criticised for the benefits its reserve-currency status confers. Valéry Giscard d’Estaing, France’s finance minister in the 1960s and later its president (1974-1981), famously chastised the US for the “exorbitant privilege” it enjoys as the issuer of the reserve currency.

However, reserve status also has its drawbacks. Low borrowing costs can encourage the public and private sectors to spend more frivolously, racking up debt as a result. The federal budget deficit is expected to reach around $4trn this year, roughly 20 percent of US GDP.

Because of its outsize role in global trade, the strength of the dollar has a significant impact on global economic growth. A strong dollar makes it more expensive for other countries to pay for imports, reducing demand and therefore economic activity.

This leaves countries exposed to spillovers from fluctuations in the US economy. “Emerging market governments and economies are more vulnerable to foreign exchange risk, owing to a lower capacity to borrow and transact in their own currencies,” said Cantrell. “This can become a vicious cycle when foreign exchange depreciation makes debt repayment more difficult, possibly triggering financial or balance of payment crises.”

Last year, the former Governor of the Bank of England Mark Carney suggested that central banks come together to create their own replacement reserve currency to counter the “destabilising” effects of relying on the US dollar. Cantrell explains that, as well as limiting countries’ exposure to economic shocks in the US, a different monetary system could reduce US hegemony beyond the financial system. “Especially when the USA’s place in global finance is leveraged for American foreign policy interests – for example in enforcing sanctions against Iran – this tests the patience of other actors in the system, including of Europe. China is also a vocal and proactive opponent of dollar dominance for this reason. This is especially true as China attempts to establish a regional network of infrastructure, technology and trade independent of the USA, signified by the Belt and Road Initiative,” he said.

The pandemic is bringing the destabilising effects of the dollar’s dominance to light. Usually, a strong dollar benefits emerging markets, since weaker exchange rates can make their exports more competitive. But a study by the International Monetary Fund found that the dollar’s dominance could exacerbate the impact of the coronavirus crisis on the global economy, as weaker exchange rates could be less effective shock absorbers than in the past.

 

Warning signs
Central banks may stand to benefit from a more decentralised global monetary system. The question is whether this is a viable option at all. Eswar Prasad, the Tolani Senior Professor of Trade Policy and Professor of Economics at Cornell University, thinks this is currently not the case.

“Concerted efforts by other central banks could lead to a decline in the role of the dollar as the dominant currency for denominating and settling international payments. However, there are no obvious alternatives to the US dollar as a safe haven currency. The euro has stumbled and the renminbi has stalled, leaving no realistic alternatives to the dollar’s status as the dominant global reserve currency,” he said.

The lack of a convincing alternative is why the dollar has been able to command dominance for such a long period of time, according to Cohen. “The euro has been plagued by weak governance and serious public debt problems. Japan’s yen has been weighed down by the long slow decline of the Japanese economy. And the Chinese renminbi remains encumbered by China’s panoply of capital controls and its still relatively primitive financial markets,” he said. “A well-known US economist coined the term the ‘unloved dollar standard.’ The greenback is not loved, but investors and central banks ask: ‘What else is there?’”

Ironically, as well as exposing the dollar’s dominance, the pandemic has also entrenched it. “The rapid growth of USD-denominated debt to fund COVID-related expenses around the world – in the form of multilateral loans, sovereign and corporate bond issuance, bank loans, and more – further entrenches demand for dollar liquidity in the future. It also reinforces its use as pricing currency for international transactions,” said Cantrell.

Prasad agrees that the dollar’s liquidity will ultimately strengthen it internationally. “The Fed’s apparent magnanimity in allowing other countries to have access to dollar financing collateralised by their holdings of US Treasuries will pull countries even deeper into the clutches of the dollar,” he said. “The Fed’s provision of abundant dollar liquidity to foreign central banks through currency swap lines and lines of credit collateralised by Treasuries will strengthen the dollar’s dominance in global finance.”

Only once before has a dominant currency been unseated: when the dollar took over from sterling. Such a dramatic shift in the global geopolitical order is unlikely to arrive any time soon; in fact, for now, the pandemic will strengthen the currency’s dominance. But the weakening of the dollar suggests that this geopolitical order is nonetheless beginning to fray at the edges. The US should treat it as a warning. By relinquishing global leadership and damaging the credibility of its own institutions, the US risks forfeiting its “exorbitant privilege” once and for all.

US weekly jobless claims fall below one million

The number of US citizens filing claims for unemployment benefits fell below one million for the second time since the coronavirus pandemic started, the US Labor Department reported on 3 September. However, the department cautioned that this did not translate into a strong recovery in the labour market.

New US weekly jobless claims fell to 881,000 for the week ended 29 August. This was below the 950,000 that economists polled by Bloomberg had forecast.

The fall in initial claims partly reflects a methodological change: the government dropped the multiplicative seasonal adjustment factors it had been using because the pandemic’s economic impact had made them less reliable. Unadjusted claims rose to 833,352 in the same week.

Since the pandemic began, more than 59 million unemployment claims have been filed in the US, far outstripping the 37 million claims filed during the Great Recession. While the labour market is showing signs of improvement, economists warn that the pace of progress has slowed since an initial bounce in May and June. This implies that it could be a long time before the US economy recovers.

The data is likely to prompt calls for more economic stimulus in the country. Currently, Democrats and Republicans are deadlocked over the details of the next coronavirus relief bill.

In July, a $600 weekly unemployment supplement expired, cutting income for millions of unemployed people in the US. President Donald Trump signed an executive order for a $300-per-week federally funded jobless benefit for workers, with an additional $100 provided by states. However, the benefit may only last a few weeks, leaving out-of-work Americans in financial trouble once again.

The significant impacts an EU recovery fund would have

The announcement dropped like a bomb in European capitals, most of which were still under strict lockdowns. In a joint press conference, German Chancellor Angela Merkel and French President Emmanuel Macron proposed an EU recovery fund that would offer €500bn ($569.2bn) in grants as an economic lifeline to pandemic-stricken members of the union. Observers of EU history hailed this as a ‘Hamiltonian moment’, a reference to Alexander Hamilton, the visionary who spearheaded the federalisation of the states’ debts in the US.

The proposal stopped short of mentioning eurobonds, a financial instrument collectively guaranteed by EU member states that has become a bone of contention in the bloc’s response to the novel coronavirus. And yet, it was instantly recognised as a bold step towards bringing the union closer to what was hitherto unthinkable: joint debt issuance, a typical feature of fiscal unions. Wolfango Piccoli, co-president of political risk advisory at Teneo, a US management consulting firm, told World Finance: “The French-German [proposal] broke two fundamental taboos: it opened the possibility for European governments to engage for the first time in massive joint borrowing, and sanctioned significant fiscal transfers between its member states.”

 

Beware the frugal four
Just 10 days after Macron and Merkel let the cat out of the bag, the European Commission announced its own plan. It was even more generous, offering an extra €250bn ($284.5bn) in loans on top of the €500bn in grants proposed in the French-German plan. The funds will be raised via EU-issued bonds and financed through a series of new taxes and levies. These include staples of the EU repertoire, such as taxes on large corporations and tech powerhouses, as well as measures reflecting Brussels’ Green Deal, including taxes on carbon and plastic. The €750bn ($853.6bn) recovery fund, aptly called Next Generation EU, incorporates the essence of the French-German proposal and also adds ideas from countries that are less enthusiastic about shared debt. Michael Hüther, a German economist and director of the German Economic Institute, told World Finance: “The commission’s proposal clearly bears the signature of the German and French Governments, as it includes a high level of transfer. The question is, however, whether this high level is necessary to help the affected states in the current situation.”

The proposal comes with various conditions that make it less ambitious than what its main beneficiaries were hoping for – grants will not be used to finance existing debt, for example. Its timing and innovative set-up, however, have boosted the hopes of Europhiles that something bigger is in the works. Bonds will be issued in the name of the EU, while the commission will oversee fund allocation. For over-indebted countries with volatile sovereign credit ratings, this will be a boon, as the bonds will have the coveted AAA rating that puts them in the ‘safe asset’ category. But Hüther believes the impact on the EU’s coffers remains a concern: “The repayments will place a heavy burden on the EU budget for many years, from 2028 onwards. EU taxes proposed by the commission to finance the fund are unlikely to find a majority among member states.”

The commission needs to convince all member states that its plan is the best way to move forward. Persuading Austria, Denmark, the Netherlands and Sweden – a bloc that has been named ‘the frugal four’ for its aversion to shared debt – will take a lot of effort and possibly some concessions. A few days before the commission announced its proposal, the frugal four presented a different recovery policy, offering loans rather than grants and emphasising the temporary character of any intervention. However, the commission’s plan is expected to get the green light in one form or another, given that it bears the stamp of the Franco-German engine that traditionally spearheads reform in the EU. Piccoli said: “The negotiation will be a tough one, but given that Germany is the biggest contributor, it will go a long way to convince some of the reluctant countries.”

 

EU turn
France has always been a champion of debt mutualisation, driven by its precarious economic position – the country’s public debt is approaching the 100 percent debt-to-GDP threshold (see Fig 1). Macron is also a staunch Europhile with bold ideas for the future of the bloc. Until recently, though, Germany was the unofficial leader of the frugal group: several economists, including former Greek finance minister Yanis Varoufakis, floated the idea of issuing eurobonds during the sovereign debt crisis, only for it to be rejected by the German Government. This is why Merkel’s sudden embrace of the idea has come as a surprise.

Some point to fierce pressure from Ursula von der Leyen, who was the longest-serving member of Merkel’s cabinet before becoming president of the European Commission, as a possible explanation. The fact that Germany will take over the European Council presidency in July and lead negotiations on the EU’s 2021-27 budget might also have played a role. Others point to more pragmatic reasons, such as concerns over Italy’s soaring debt, which currently sits above €2.4trn ($2.73trn) – several times more than that of Greece. Adding further debt to tackle the consequences of the lockdown would make Italy’s recovery more difficult.

Some cracks in the opposition to eurobonds emerged in March, when a group of influential German economists published an article via several European media outlets, calling for the issuance of €1trn ($1.14trn) in crisis bonds. Hüther, who was among the authors, told World Finance: “The union is sending a strong signal of European solidarity in a situation where the cost of borrowing at the European level is very low.” The German public had started to warm to the idea at the peak of the COVID-19 pandemic, with the local press stressing the importance of European solidarity during a global healthcare crisis.

As Jonathan Hackenbroich, a policy fellow for economic statecraft at the Berlin branch of the European Council on Foreign Relations, explained to World Finance: “The German economy is dependent on exports [see Fig 2] and a liberal trade order. With that being more difficult internationally, the government knows that a strong EU market becomes more important, and Germany can’t just focus on exports to third countries.” He added that developments on the other side of the Atlantic might have influenced the German Government’s decision: “Germans and [other] Europeans can’t make their own economic decisions in some instances anymore because of US economic nationalism. The dollar, which Europeans used to view almost as a public good, is getting weaponised. That’s partly why the German Government recognises how important it is to have a strong European market.”

Merkel’s political calculations may have played a role, too. The German chancellor is expected to step down next year, giving her leeway to make difficult decisions without taking the political cost into account. Hackenbroich believes the successful management of the healthcare crisis in Germany has led to a renaissance of ‘Merkelism’: “The reason why [Merkel] can dare to make concessions is that she is highly popular. Her party is leading the polls by a wide margin because she [has] handled the crisis really well so far. German people are happy that their leader is Merkel and not someone like [UK Prime Minister] Boris Johnson or [Brazilian President Jair] Bolsonaro.”

 

Pulled in different directions
The ambitious French-German proposal couldn’t have come at a more crucial time for European unity, which is being challenged by two parallel crises: the pandemic, and the heated debate over how to respond to the economic tsunami caused by strict lockdowns. Old grievances, thought to be dormant since the worst days of the Greek debt crisis, have come back to the fore. The push for debt mutualisation was led by Spain and Italy – the two countries that took the biggest hit during the early stages of the pandemic – and has been backed by Portugal, France, Ireland and Greece.

Frugal member states from the North were having none of it, though, wary of the moral hazard that could delay reforms in Southern Europe. In a Eurogroup meeting held via videoconference in late March, the Dutch finance minister Wopke Hoekstra sparked uproar when he demanded that Brussels investigate why some countries were not prepared for a financial crisis just a few years after the previous one. Not one to mince his words, Hoekstra categorically rejected eurobonds as an irrelevance. To southern ears, this was nothing more than hubris while the virus claimed the lives of thousands of people daily. Dropping all pretence of diplomatic courtesy, the Portuguese Prime Minister António Costa dismissed Hoekstra’s remarks as “senseless” and reminiscent of the eurozone’s recent woes: “No one has any more time to hear Dutch finance ministers as we heard in 2008, 2009, 2010 and so forth.”

Costa’s remarks reflect the deep frustration that can be found in Southern Europe over what was deemed to be unwarranted virtue signalling from the frugal North during an unprecedented healthcare crisis. Member states’ political leanings also contributed to the acrimony, with the left-wing governments in Portugal, Spain and Italy protesting that a decade of austerity had left them with little leeway to support their economies while generous bailout packages were needed for companies and employees. Passions ran high, with Italian politicians going as far as accusing the Netherlands of being a tax haven.

The tension exposed the new internal dynamics within the EU. The UK’s departure has created a gap in the balance of power between France, which usually sides with southern members, and Germany, previously the leader of the frugal North. Onno de Beaufort Wijnholds, a Dutch economist who previously served as executive director of the IMF, has suggested that Germany may have welcomed the eagerness of the Netherlands to take up the mantle of fiscal probity: “It may well be that Germany prodded its neighbour to lead the opposition, thus having the initial Italian wrath directed at the Netherlands. The Netherlands – like Germany and some other northerners, but this time [in a] more outspoken [manner] – does not wish to participate in a transfer union without some conditionality. Without it, we might see Italy becoming a ‘super Greece’.”

Some fear that the wounds to European solidarity will take a long time to heal, with Euroscepticism rising in Spain and Italy, countries hitherto deemed bastions of the EU. In the eyes of Lorenzo Codogno, former chief economist at the Treasury Department of the Italian Ministry of Economy and Finance, the euro has become anathema to many Italians. As he explained to World Finance: “The root of the problem is that Italy is the only country in the eurozone that still has real GDP per capita slightly below the level [it had] when the single currency was launched in 1999. It is not the euro’s fault, but it is too easy to make a connection between the two phenomena.”

 

En garde, Madame Lagarde
Italian sentiment towards the EU was further bruised in March, when European Central Bank (ECB) President Christine Lagarde remarked that “the ECB is not here to close spreads”, referring to differences in eurozone members’ borrowing costs. The timing couldn’t have been more unfortunate, as Italy was facing the peak of the COVID-19 pandemic. The country’s bond yields were already reaching levels reminiscent of the sovereign debt crisis, and Lagarde’s comment sent them even higher. Many compared her insouciance with the conduct of her predecessor, Mario Draghi, who, at the peak of the sovereign debt crisis, said the ECB would do “whatever it takes to preserve the euro” – a statement that boosted market confidence in the eurozone.

Piet Haines Christiansen, Chief Strategist (ECB and Fixed Income) at Danske Bank, told World Finance: “[Lagarde] got off [to] a rough start with that comment. In retrospect, it was right, but it is not something that markets wanted to hear.”

Christiansen went on to argue that the ECB seems to have adopted an approach close to the ‘Greenspan put’ principle that reigned supreme in the 1990s, ensuring that spreads stay under control without actively intervening to close them.

The furore over Lagarde’s comment cast the ECB into the centre of the debate over the future direction of the eurozone. Since the sovereign debt crisis, the bank has been propping up the continent’s financial system through quantitative easing and bond-buying programmes, vastly expanding its balance sheet.

The same approach was followed in March when the bank launched a new €750bn ($853.6bn) pandemic emergency purchase programme (PEPP) that aimed to support pandemic-hit countries and companies, while its public sector purchase programme (PSPP) kept serving as a backstop for sovereign debt. A key goal for the bank is to prevent a ‘doom loop’ of rising sovereign credit risk that drags down banks in the weakest members of the eurozone. Christiansen said: “The ECB can continue the PSPP and PEPP programmes for as long as it takes. We should never underestimate what they can do. They set the rules of the game.”

Even before the pandemic, critics were pointing to the limits of this approach. Many economists have warned that monetary stimulus has artificially inflated asset prices and hit savers through negative interest rates while supporting indebted southern member states. Others have stressed the need for fiscal stimulus driven by governments. This now seems inevitable: as the EU builds up its defences through the commission’s recovery plan, the ECB is expected to take a less active role. Hackenbroich said: “Merkel has been clear that governments should shoulder some of the burden of the ECB. There will be more of a balance, but the ECB will remain key to eurozone policies.”

 

The first step
Time is pressing as the economic consequences of the pandemic become clearer (see Fig 3). The ECB has warned that the eurozone’s economy will shrink by eight to 12 percent in 2020. Merkel, meanwhile, has said consultations with national parliaments and the European Parliament should be concluded in the autumn. Some expect that a typical ‘Eurofudge’ deal will be made at the last minute. Wijnholds told World Finance: “A compromise will most probably be reached, leaving both parties somewhat unhappy with a result that they can sell to their home base.”

Merkel and Macron have recognised that their plan was a temporary response to the pandemic, with more robust action needed down the line. Joint debt issuance may come to be seen as a pattern for future crises, but it is a prospect that will likely be met with fierce resistance. Hüther said: “The proposals should not lead to structural changes in the EU financial architecture or repeated borrowing at the European level. The fund entails the risk, however, that in future crises the commission will very quickly press for further EU borrowing.”

The COVID-19 pandemic has reasserted the importance of the nation-state. Borders were re-established between Germany and France, even if temporarily. One of the lasting impacts of the pandemic, however, may be the transfer of more power to Brussels.

The fact that Germany insisted on tying recovery policies with the bloc’s long-term objectives, such as the EU’s Green Deal and digital transformation, points to a deeper commitment to the European project. Some see an idiosyncratic fiscal union rising from the embers of post-pandemic Europe, with the next step being a common budget for the eurozone – a pet project of the French president. As Hackenbroich explained, Europhiles may only have to play the waiting game: “[A] fiscal union is too far-fetched yet, although [the French-German proposal] is a step towards it. More taboos will have to be broken for that.”