Bank forecasting needs to become more than a “check-the-box” motion for regulatory compliance. It needs to be treated as a strategic decision-making tool.
Traditional financial institutions take deposits from customers and use them to make loans. But they lend out far more than they hold in reserve at any given point in time, a practice known as fractional-reserve banking. The difference between the interest earned on loans and the interest paid to depositors is the net interest margin, which determines a bank's profitability. The difference between a bank's assets and its liabilities is its equity, which determines the bank's resilience to external shocks.
Before the latest run on the bank, SVB was viewed as not only a profitable banking institution but also a safe one, because it held $212 billion in assets against roughly $200 billion in liabilities. That left a cushion of $12 billion in equity, or 5.6% of assets. That's not bad, although it's roughly half the industry average of 11.4%.
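The cushion arithmetic above can be sketched in a few lines of Python (a toy illustration using the figures cited in this article, not SVB's actual balance sheet model):

```python
# Equity cushion from the figures cited above (illustrative only).
assets = 212e9       # total assets: $212 billion
liabilities = 200e9  # total liabilities: roughly $200 billion

equity = assets - liabilities        # the bank's cushion
equity_ratio = equity / assets       # cushion as a share of assets

print(f"Equity cushion: ${equity / 1e9:.0f}B ({equity_ratio:.1%} of assets)")
# Note: a $15B unrealized loss would more than wipe out this $12B cushion.
```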
The problem is that recent interest rate hikes by the U.S. Federal Reserve reduced the value of long-term debt, to which SVB was heavily exposed through its mortgage-backed securities (roughly $82 billion). When SVB disclosed to its shareholders in December that it had $15 billion in unrealized losses, enough to wipe out the bank's equity cushion, it prompted many questions.
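Why do rate hikes reduce the value of long-term debt? A bond's price is the present value of its future cash flows, so when yields rise, prices fall, and the longer the maturity, the larger the fall. A hedged sketch with hypothetical numbers (these are not SVB's actual holdings or yields):

```python
# Present value of a zero-coupon bond: face / (1 + y)^t.
def zero_coupon_price(face: float, yield_rate: float, years: int) -> float:
    return face / (1 + yield_rate) ** years

# A 10-year bond bought when yields were around 1.5%...
price_at_purchase = zero_coupon_price(100, 0.015, 10)

# ...is worth considerably less after yields rise to 4.5%.
price_after_hikes = zero_coupon_price(100, 0.045, 10)

loss_pct = 1 - price_after_hikes / price_at_purchase
print(f"Mark-to-market loss: {loss_pct:.0%}")  # roughly a quarter of value
```

The loss exists whether or not the bank sells; it only becomes "realized" when the bank is forced to sell, which is exactly what happened to SVB.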
On March 8, SVB announced it had sold $21 billion in liquid assets at a loss and said it would raise money to offset that loss. But the announcement that it needed to raise more capital, and was even considering a sale of the bank, alarmed investors significantly, leading to roughly $42 billion in attempted withdrawals from the bank. SVB did not have sufficient liquidity to cover them, and the Federal Deposit Insurance Corporation took over on March 10.
The macro-finance literature has a lot to say about these situations, but a good summary is to expect highly non-linear dynamics: small changes in inputs (the equity-to-asset ratio) can produce large changes in outputs (liquidity). Bank runs are more likely during recessions and can have large effects on aggregate economic activity.
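The non-linearity can be illustrated with a toy threshold model (a hypothetical sketch for intuition, not a model from the macro-finance literature): withdrawals stay modest while depositors believe the bank is solvent, then spike once the equity-to-asset ratio falls below a confidence threshold.

```python
# Toy model: attempted withdrawals as a function of the equity-to-asset ratio.
# The 3% threshold and the 2%/25% outflow rates are made-up parameters.
def attempted_withdrawals(equity_ratio: float, deposits: float,
                          threshold: float = 0.03) -> float:
    if equity_ratio >= threshold:
        return 0.02 * deposits   # normal outflows: confidence holds
    return 0.25 * deposits       # panic: a run begins

deposits = 100e9
print(attempted_withdrawals(0.031, deposits))  # just above the threshold
print(attempted_withdrawals(0.029, deposits))  # just below: outflows jump 12x
```

A 0.2-percentage-point drop in the input produces an order-of-magnitude jump in the output, which is the sense in which small balance-sheet changes can trigger outsized liquidity events.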
Pursuing structural solutions
To be sure, SVB is not the only bank with high-risk exposure to macroeconomic conditions, such as interest rates and consumer demand; it was simply the tip of the iceberg that made the news over the past week. And we've seen this before, most recently during the 2007–2008 financial crisis with the collapse of Washington Mutual. The aftermath led to a surge in financial regulation, largely through the Dodd–Frank Act, which expanded the Federal Reserve's authority to regulate financial activity and authorized new consumer protection guidelines, including the launch of the Consumer Financial Protection Bureau.
Of note, the Dodd–Frank Act also enacted the "Volcker Rule," restricting banks from proprietary trading and other speculative investments, largely preventing them from functioning as investment banks that use their own deposits to trade stocks, bonds, currencies and so on.
The rise of financial regulation led to a sharp change in the demand for science, technology, engineering and math (STEM) workers, or "quants" for short. Financial services are especially sensitive to regulatory changes, and much of the burden falls on labor, since compliance drives up non-interest expenses. Banks realized they could reduce compliance costs and increase operational efficiency by increasing automation.
And that’s exactly what happened: The proportion of STEM workers grew by 30% between 2011 and 2017 in financial services, and much of this was attributed to the increase in regulation. However, small and mid-sized banks (SMBs) have had a more challenging time coping with these regulations — at least in part due to the cost of hiring and building out sophisticated dynamic models to forecast macroeconomic conditions and balance sheets.
The current state of the art in macroeconomic forecasting is stuck in 1990s-era econometric models that are highly inaccurate. While forecasts are often adjusted at the last minute to appear more accurate, the reality is that there is no consensus workhorse model or approach for forecasting future economic conditions, setting aside some exciting experimental approaches by, for example, the Atlanta Federal Reserve with its GDPNow tool.
But even these “nowcasting” tools do not incorporate vast quantities of disaggregated data, which makes the forecasts less germane for SMBs that are exposed to certain asset classes or regions and less interested in the national state of the economy per se.
We need to move away from forecasting as a “check-the-box” regulatory compliance measure toward a strategic decision-making tool that is taken seriously. If the nowcasts do not perform reliably, either stop producing them or figure out a way to make them useful. The world is highly dynamic, and we need to use all the tools at our disposal, ranging from disaggregated data to sophisticated machine learning tools, to help us understand the times we’re in so that we can behave prudently and avoid potential crises.
Would better modeling have saved Silicon Valley Bank? Maybe not, but better modeling would have increased transparency and the probability that the right questions would be asked to prompt the right precautions. Technology is a tool — not a substitute — for good governance.
In the aftermath of Silicon Valley Bank’s collapse, there has been a lot of finger-pointing and rehashing of the past. More important, we should be asking: Why did the bank run happen, and what can we learn from it?