
Gain confidence ahead of your next regulatory exam: the benefits of completing a thorough model validation

When it comes to risk, time is of the essence. Prepare your financial institution for its next regulatory review – and cross the CECL finish line. In this article, our financial services specialists explain best practices for model validation – from a full review of your model governance to a deep dive into data inputs and assumptions.

In preparation for the Dec. 31, 2022, Current Expected Credit Losses (CECL) implementation date, banks have focused on developing their CECL models. Now it is time to cross the CECL finish line by completing a thorough model validation. A proper validation will give your institution confidence heading into its next regulatory review, tie up any loose ends or inconsistencies within the model, documentation or process, and review all model types, methods, vendors and data to produce best-practice recommendations. Even if your model has no inconsistencies or loose ends, a validation confirms it is operating as intended. A proper model validation answers your most pressing questions: What are we missing? What can we do better? Should we be tracking other data? And most importantly: Are we compliant?

Why perform a model validation?

Model validation is the set of processes and activities intended to verify that models are performing as expected, in line with their design objectives and business uses. The use of models invariably introduces model risk, which is the potential for adverse consequences from decisions based on incorrect or misused model outputs and reports. Model risk can lead to financial loss, poor business or strategic decision-making and reputational damage for your organization. It is prevalent and can be caused by misusing a model or using incorrect data, which is why a flexible and comprehensive model risk management strategy is important for every financial institution.

Until now, most of these model outputs have not made it into financial statements. This is why CECL model validations are so important: these numbers are not only used to make informed financial decisions but are also featured in an institution's financial statements. It is essential to be confident that the model is working as intended and that the data is accurate and complete, so that the outputs make sense and are reliable.

An effective validation helps ensure that models are sound, identifies potential limitations and assumptions, and assesses their possible impact. All model components, including input, processing and reporting, should be subject to validation. This applies equally to models developed in-house and to those purchased from or developed by vendors or consultants.

It is important to remember that one organization's model validation typically looks different from another's. If your institution uses vendor software, that vendor most likely had a model validation performed on its software. A vendor validation is a validation of the mathematical formulas and proprietary algorithms of the vendor-based model. It is not client or data specific, and ultimately has nothing to do with your data or your organization. Its main purpose is to determine whether the math behind the model makes sense. Because these models run very complex algorithms that often rely on multifactor regressions and forecasts to produce a CECL estimate, it is vital to ensure the mathematical formulas are sound.

A client-specific validation is a validation of the client's data within the model. It is an internally developed, Excel-based template validation that seeks to understand how the client's contractual data is uploaded and run within the model, and how the assumptions are applied. There are multiple steps in the process of getting data into the model, with different inputs and segmentations used along the way. Because the data is manipulated outside of the software, there is always a risk that the data uploaded to the model will change or upload incorrectly. A client-specific validation checks the process, ensures everything is complete and accurate, and confirms the client's data is being modeled properly.

While vendor validations are important to make sure the model calculations are accurate, we believe client-specific validations are key to make sure your data is working properly within the model. Depending on the complexity of a portfolio, some financial institutions choose to adopt several different model validation methods.  

Regulatory environment

Model validations continue to be an increased focus during regulatory examinations, especially after recent events in the banking industry. Considering we are still in the infancy of CECL-based models, it is essential to ensure they are working properly. Banks should conduct a periodic review – at least annually, but more frequently if warranted – of each model to determine whether it is working as intended and whether the existing validation activities are sufficient. Any institution that is subject to regulatory examinations should have a model risk management program in place to ensure all models are inventoried and reviewed.

An effective validation framework should include three core elements:

The first is an evaluation of conceptual soundness. Financial institutions should employ sensitivity analysis in model development and validation to check the impact of small changes in inputs and parameter values on model outputs and confirm they fall within an expected range. Assessing the quality of model design and construction includes:

  • An assessment of key assumptions and the choice of variables
  • Comparison to alternative theories and approaches
  • An understanding of the relevance of the data used to build the model
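In practice, a basic sensitivity check perturbs each input slightly and flags outputs that move outside an expected range around the baseline. The sketch below is illustrative only: `reserve_estimate` is a hypothetical stand-in for an institution's actual CECL model, and the figures and tolerance are invented for demonstration.

```python
def reserve_estimate(balance, annual_loss_rate, remaining_life_years):
    """Toy stand-in for a CECL model output (not a real methodology)."""
    return balance * annual_loss_rate * remaining_life_years

def sensitivity_check(base_inputs, shocks, tolerance=0.25):
    """Perturb each input by a small relative shock and report how far the
    output moves from the baseline, plus whether it stays within tolerance."""
    baseline = reserve_estimate(**base_inputs)
    findings = []
    for name, shock in shocks.items():
        perturbed = dict(base_inputs)
        perturbed[name] *= (1 + shock)
        result = reserve_estimate(**perturbed)
        change = (result - baseline) / baseline  # relative change in output
        findings.append((name, shock, round(change, 4), abs(change) <= tolerance))
    return baseline, findings

# Hypothetical portfolio segment: $1M balance, 1.2% annual loss rate, 3.5-year life
base = {"balance": 1_000_000, "annual_loss_rate": 0.012, "remaining_life_years": 3.5}
baseline, findings = sensitivity_check(
    base, {"annual_loss_rate": 0.10, "remaining_life_years": -0.05}
)
```

A reviewer would expect a 10% shock to the loss rate to move this linear toy model's output by exactly 10%; a disproportionate response in a real model would prompt further investigation.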

The second element, ongoing monitoring, confirms that the model is appropriately implemented, used and performing as intended; it includes process verification and benchmarking. Process verification ensures that all model components are functioning as designed. Benchmarking compares a given model's inputs and outputs to estimates from alternative internal or external data or models.

The third element, outcome analysis, is the comparison of model outputs to corresponding actual outcomes. Back-testing is one form of outcome analysis: it compares actual outcomes with model forecasts during a sample period not used in model development, at an observation frequency that matches the forecast horizon or performance window of the model.
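A minimal back-testing sketch compares forecasts to realized outcomes over a holdout period and summarizes the error. All loss figures below are made up for illustration, and mean absolute percentage error is just one of several summary statistics an institution might choose.

```python
def back_test(forecasts, actuals):
    """Return per-period forecast errors (forecast minus actual) and the
    mean absolute percentage error across the holdout periods."""
    errors = [f - a for f, a in zip(forecasts, actuals)]
    mape = sum(abs(f - a) / a for f, a in zip(forecasts, actuals)) / len(actuals)
    return errors, mape

# Illustrative quarterly figures from a holdout period not used to build the model
forecast_losses = [120_000, 95_000, 140_000, 110_000]  # model forecasts
actual_losses   = [115_000, 101_000, 133_000, 118_000]  # realized charge-offs

errors, mape = back_test(forecast_losses, actual_losses)
```

Consistently one-sided errors (all positive or all negative) would suggest systematic over- or under-forecasting rather than random noise.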

Key components of a CECL model validation

The comprehensive model validation framework includes four recommended phases. For a more detailed explanation of each phase, refer to this article focusing specifically on CECL model validations.

Developing and maintaining strong governance, policies and controls over the model risk management framework is fundamentally important to its effectiveness. All financial institutions that rely on models should implement an appropriate governance program, and all aspects of model risk management should be covered by suitable policies. Clear and comprehensive model documentation is also critical to providing internal and external parties an understanding of the final model.

Many model validation software packages now include qualitative factor (q-factor) adjustments and stress tests. However, most institutions choose to make q-factor adjustments outside the model in an Excel-based template and then work them back into the model later. For more information on q-factors and forecasting, refer to the recordings of recent webinars our CECL specialists hosted regarding q-factors for both basic and complex entities, as well as our recent article that highlights best practices for banks when implementing q-factors.

The complex method of data inputs includes several pieces of information, including payment type, payment amount, interest rate, maturity date, payment frequency, amortization, prepayment rate, probability of default, loss given default and recovery delay. All of these data fields are very important, and inconsistencies and mistakes commonly arise across the different structures of the data pool and inputs.

The simple method also relies on substantial data, depending on the model you are using, including factors such as outstanding balances, average annual loss rates, remaining life, and reasonable and supportable forecasts. Quality control checks on this data are essential to a proper model validation.
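Using the factors named above, a simple-method estimate can be sketched as a WARM-style calculation: balance times an adjusted loss rate times remaining life, summed across segments. The segments, rates and forecast adjustments below are hypothetical, and this is a simplified illustration rather than any specific vendor's methodology.

```python
def warm_reserve(balance, loss_rate, remaining_life, forecast_adjustment):
    """Balance x (historical average annual loss rate + reasonable-and-
    supportable forecast adjustment) x remaining life in years."""
    return balance * (loss_rate + forecast_adjustment) * remaining_life

# Illustrative segments:
# (name, outstanding balance, avg annual loss rate, remaining life, forecast adj)
segments = [
    ("consumer",   2_500_000, 0.0100, 2.0, 0.0020),
    ("commercial", 4_000_000, 0.0065, 3.5, 0.0010),
]

total_reserve = sum(
    warm_reserve(balance, rate, life, adj)
    for _, balance, rate, life, adj in segments
)
```

Even in this stripped-down form, the calculation shows why quality control on each input matters: an error in any one field (balance, rate, life or adjustment) flows directly into the reserve estimate.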

Monitoring and back-testing check whether an institution's model history is consistent with what is seen on the call report – average annual loss rates, for example. Looking at the model history ensures that all of the historical data behind the assumptions – not just the actual cash flows – is reasonable for your specific institution. This is also essential if you are using peer data from a similarly sized institution that has experienced similar losses.

Model testing includes looking at a full replication of the model in order to review the model framework. It includes assumption testing, scenarios and stress testing analysis and back-testing as well as category level reviews.

There are many benefits to a model validation: it allows you to verify every part of the process from start to finish, ensure data quality control, learn how assumptions and methodology impact your profile, and obtain recommendations on policy, documentation and governance. When it comes to addressing risk, time is of the essence – cross the CECL finish line and give your institution confidence in its next regulatory review by completing a thorough validation.

Reach out to our CECL specialists to discuss best practices when considering a model validation and lessons learned from the validations that they’ve completed to date.

Ivan Cilik
Partner