This is an extract of a speech presented by Rismark’s Christopher Joye at last week’s LIXI Industry Forum:

One of the problems with managing the Byzantine nexus between extreme asset price and credit cycles, and the ordinarily adverse ramifications of these events for the real economy, is that policymakers have historically had very poor credit data.

As the RBA spent considerable time highlighting in its Financial Stability Review, the measurement of seemingly simple statistics like mortgage default rates is an exceedingly complicated exercise, both within countries and across nations. While this might sound esoteric to some, understanding the risks accompanying the $1 trillion worth of Australian mortgage debt is vital to all of our welfare.

Yet even these default rates are an ex post expression of duress. That is to say, you only find out about it after the risk has already materialised. In fact, the probability of default on a home loan tends to peak a full two to three years after the date on which a loan is originated. A more valuable leading indicator of financial stress that could be used to predict changes in default rates over time would be ‘live data’ on the quality of the assessment standards employed by all lenders when they extend credit.

To the best of my knowledge, it has been awfully difficult for policymakers to get access to this critical real-time information. This is also true of investors in the residential mortgage-backed securities (RMBS) sector, where issuers (ie, lenders) do not ordinarily provide investors with data on the debt serviceability ratios underpinning the individual loans that they are vending into the market. They too are stuck with after-the-event default rates.

Getting data on dynamic changes in lending standards over time is crucial precisely because these standards are not static. History tells us that credit standards are highly pro-cyclical — money is easy to access during the good times, and hoarded by suddenly risk-averse lenders during the bad.

In many ways, the GFC was simply an echo of the coincident asset price and credit boom (and subsequent bust) of the late 1980s, with the distinguishing characteristic that, twenty years on, global capital markets are far more interdependent due to the profound information and communications technology revolutions that have literally changed our way of life (aka the Internet). Oh, and substitute sub-prime loans for that 1980s innovation, junk bonds.

Another important difference this time around is that consumer, business and institutional investor confidence appears to be even more fickle and volatile than in days gone by because of the mind-boggling speed with which information is transmitted around the globe.

Whereas in the 1980s households had to wait for information to slowly percolate its way through the print and television news cycles, today consumers are fed this content in shockingly voyeuristic fashion. It’s like capitalism has become a perverse version of reality TV. And the velocity of this process means that the editorial and diligence standards previously applied to interrogating information have frequently been diluted (or, at the very least, consumers absorb far more ‘raw’ content than ever before).

The tyranny of distance is long gone — today we face the tyranny of virtual proximity. And so we saw a butterfly effect: US households defaulting on their loans quickly caused chaos around the world, including, but not limited to, the first run on a UK bank since 1866, the partial nationalisation of many private banking systems, and wholesale changes of government.

These new dependencies between economic events, the release of information, consumer and business sentiment, and their feedback into real behaviours warrant detailed study by academic researchers. They also mean that policymakers and politicians have arguably higher duties of care to faithfully communicate with the nation as opposed to seeking to mercenarily exploit these relationships for short-term gain (eg, by exaggerating the nature of problems we face).

All of this is a rather roundabout way of saying that regulators also need to think more creatively about how they monitor, measure and ultimately manage risk. And in this context, I have a policy proposal. I have put this to government economists and industry participants, and they have, without exception, enthusiastically embraced it. The only thing preventing us from implementing constructive policy is political will.


I would propose that the Commonwealth establish a central electronic ‘clearinghouse’ of all residential, personal and business credit originated in Australia. For simplicity’s sake, let’s call it the National Electronic Credit Register (NECR).

If you think about it, credit is effectively an over-the-counter (OTC) contract. There is no centralised exchange novating the relationship between the parties as we see, for instance, with companies listed on the stock exchange, or with listed derivatives and futures contracts. In the latter cases, the ASX acts as both a contractual and informational intermediary. NECR’s role would be purely around the transmission of information between parties.

As we discovered during the GFC, one of the profound shortcomings associated with OTC markets is that they effectively eviscerate transparency. The only people who know what is going on are the counterparties themselves. This causes significant information ‘asymmetries’ that destroy confidence, and is one of the principal explanations for the evaporation of liquidity in many markets during the crisis.

In Australia, APRA and the RBA do collect a great deal of ex post facto credit data. But this is normally aggregated information and does not tell them much about the individual loan-by-loan risks. It also does not necessarily furnish them with any insights into the ex ante, or before the event, credit assessment standards employed by lenders.

The establishment of NECR should be relatively straightforward. All Australian lenders have electronic lodgement processes, and there are standardised communications formats that allow lenders to communicate with one another (in the mortgage market this is known as LIXI, or the “language of lending”).

APRA, the RBA, and ASIC (to cover non-banks) could, therefore, simply insist that any licensed entity involved in the creation of personal, residential or business credit send NECR a simple data packet upon the settlement and, notably, discharge (ie, repayment) of every single loan. The lender’s transmission to NECR would contain, amongst other things:

1) A unique loan identification code (so that NECR can track the loan);
2) The loan amount;
3) The loan type (eg, 3 year fixed);
4) The interest rate;
5) The settlement/discharge date;
6) The collateral value (eg, property value);
7) The collateral address; and
8) A nationally-defined debt serviceability standard measuring the ability of the borrower to meet the repayments on the loan (all lenders use these in one form or another, so it should be easy to define a standard metric that they have to supply).

NECR might also provide the architecture required for the transmission of broader customer information between parties. That is, it might help resolve the current bottlenecks around customers switching accounts between institutions, which in turn stifle competition. I am told that one of the main obstacles to switching is the absence of the necessary electronic linkages between institutions involved in the deposit-taking and/or credit creation business. NECR could be employed as a centralised hub that any institution could use to transfer customer data to another once they are instructed to do so by the customer.

NECR would also revolutionise the RBA and APRA’s approach to risk-management. Allow me to illustrate a few examples. In the shadow of the GFC, regulators are understandably worried about system-wide debt levels (or ‘leverage’) and the rate of change in credit over time (ie, the process of gearing up, or deleveraging).

These concerns are made all the more acute given the recent tapering in national house prices combined with the spectre of further rate rises. But what arguably gets regulators most energised is the so-called ‘distribution’ of these risks. That is, those borrowers sitting in the far tails of the distribution that carry the highest hazards. But disaggregating this information when banks are sending you summary statistics is an arduous task (obviously the regulators insist on some disaggregated data).

So rather than simply calculating a system-wide loan-to-value ratio (LVR) by dividing the total amount of mortgage debt (ie, circa $1 trillion) by the total amount of residential property outstanding (around $3.5 trillion), or for the pedants, the amount of property with mortgage debt held against it (about half based on the 2006 Census), the regulators would be able to measure the individual LVRs of every single loan in the country. And they would get live updates on changes in those LVRs as loans were regularly refinanced, which they are.
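The difference between the aggregate and the loan-level view can be sketched in a few lines. The loan book below is invented for illustration; the point is simply that a comfortable system-wide ratio can coexist with a dangerous tail:

```python
# Hypothetical loan book: (mortgage debt, property value) in AUD
loans = [
    (300_000, 600_000),   # LVR 0.50
    (450_000, 500_000),   # LVR 0.90
    (380_000, 400_000),   # LVR 0.95 -- the tail risk
    (200_000, 800_000),   # LVR 0.25
]

# System-wide LVR: total debt over total collateral -- looks comfortable
system_lvr = sum(d for d, _ in loans) / sum(v for _, v in loans)

# Loan-level LVRs expose the tail that the aggregate hides
individual_lvrs = [d / v for d, v in loans]
high_risk = [lvr for lvr in individual_lvrs if lvr >= 0.90]

print(f"system-wide LVR: {system_lvr:.2f}")       # 0.58
print(f"loans at LVR >= 90%: {len(high_risk)}")   # 2
```

Half of this invented book sits at or above 90% LVR even though the headline ratio is under 60% — exactly the distributional risk that summary statistics obscure.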

Since the average home loan’s life is only around 4 years (due to refinancing), it would not be long before they had individual records on all outstanding debt.

More significantly though, real-time information on the specific lending criteria employed by institutions (as proxied by, for example, LVRs and a standardised debt serviceability metric, which does not currently exist) would allow authorities to act assertively in the upswing of concurrent asset price and credit booms rather than picking up the pieces after the bubble has burst.

A more expansive iteration of NECR would have all institutions report monthly changes in account balances and default rates during the life of their products.

Regulators could then make anonymised variants of this information aggregated at a granular, say, suburb level available to all institutions that participate in the exchange. And so, if lender A was concerned about the volume of lending, serviceability standards, and LVRs associated with home loans in, say, Perth, they could request this data from NECR, and modify their own practices as appropriate.
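A minimal sketch of that suburb-level aggregation, assuming NECR exposed anonymised per-loan records to participants (the records below are invented):

```python
from collections import defaultdict
from statistics import median

# Anonymised per-loan records: (suburb, LVR, serviceability ratio) -- invented
records = [
    ("Perth",    0.92, 0.41),
    ("Perth",    0.88, 0.38),
    ("Perth",    0.60, 0.25),
    ("Brisbane", 0.55, 0.22),
    ("Brisbane", 0.70, 0.30),
]

by_suburb = defaultdict(list)
for suburb, lvr, dsr in records:
    by_suburb[suburb].append((lvr, dsr))

# What lender A might request from NECR before writing more Perth loans
for suburb, rows in sorted(by_suburb.items()):
    lvrs = [lvr for lvr, _ in rows]
    dsrs = [dsr for _, dsr in rows]
    print(f"{suburb}: n={len(rows)}, median LVR={median(lvrs):.2f}, "
          f"median serviceability={median(dsrs):.2f}")
```

Because only suburb-level medians and counts leave the register, no individual borrower is identifiable, yet a lender can still see that (in this invented sample) Perth lending is running hotter than Brisbane’s.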

As a final observation, technology is available today that enables lenders to revalue every single home in Australia on a daily, weekly or monthly basis. These Automated Property Valuation Models (AVMs) are highly accurate on a ‘portfolio’ basis. Armed with this weaponry, regulators could literally quantify the LVRs and equity held in every home in the country as frequently as they desired in order to compute the fallout associated with multi-standard deviation property price falls (aka ‘black swans’). Now that’s what I call 21st century policymaking.
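Assuming AVM-refreshed collateral values and per-loan balances were on file, the stress calculation itself is trivial. The loan figures and the 25% shock below are illustrative only, standing in for whatever multi-standard-deviation fall the regulator wished to test:

```python
# Per-loan records: (outstanding debt, AVM-estimated property value) -- invented
book = [
    (300_000, 600_000),
    (450_000, 500_000),
    (380_000, 400_000),
]

def stress(book, price_fall):
    """Recompute each loan's LVR and borrower equity after a uniform price fall."""
    out = []
    for debt, value in book:
        shocked_value = value * (1 - price_fall)
        out.append({"lvr": debt / shocked_value,
                    "equity": shocked_value - debt})
    return out

# eg a 25% national fall, a stand-in for a 'black swan' scenario
shocked = stress(book, 0.25)
negative_equity = sum(1 for s in shocked if s["equity"] < 0)
print(f"loans in negative equity after a 25% fall: {negative_equity}")
```

Running this over every loan in the country, refreshed with each AVM revaluation, is exactly the kind of system-wide exercise that per-loan NECR data would make routine.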

In a world in which the relationship between the state and the various institutions it regulates is more intertwined, NECR seems like a sensible aspiration.