Independent verification and validation (IVV), model validation, validation, or, as Shakespeare might say, a validation by any other name is still a validation. Regardless of the name, the practice has evolved and been redefined over time. This article covers the origin of IVV, OCC guidance, regulatory pressure, IVV methodology, reporting, independence, and vendor due diligence, in order to provide an overview of the model validation process.
Background
The term IVV dates back to 1970 and the ballistic missile program (Wikipedia, 2014), where it was used to independently validate the software that targeted and processed ballistic missiles. The importance was obvious. The approach was adopted by the Department of Defense (DOD) for other software programs and eventually by NASA in 2004 (NASA, 2012). It is widely used by the European Space Agency, NASA, FAA, and DOD, and has been formalized in IEEE 1012 (NASA, 2012). The overall purpose was to validate that the software performed as designed and achieved the expected results.
Today, the IVV methodology has been leveraged by the OCC and the Federal Reserve in their Supervisory Guidance on Model Risk Management, published as SR 11-7. Because institutions rely so heavily on software models to manage compliance and risk, those models need to be periodically reviewed and validated. Systems, data, risks within the institution, and products and services all change over time, so models must be refreshed periodically to reflect these changing variables.
Figure 1: Example of anti-money laundering (AML) software programs.
There are many reasons why these variables change: mergers, new product and service releases, new regulatory concerns and challenges, new emerging threats, new technology, and new data. The only way to address this is through continuous or periodic monitoring, testing, and validation. “Banking organizations should conduct a periodic review-at least annually but more frequently if warranted-of each model to determine it is working as intended and if the existing activities are sufficient” (Board of Governors of the Federal Reserve System (SR 11-7), 2011, p. 3).
Regulatory Pressure
We are seeing pressure from regulators at all levels. Though the pressure differs by region and regulatory agency, most examiners are asking for model validations to be completed, communicating this through exit interviews and exit letters as MRAs (Matters Requiring Attention) and MRIAs (Matters Requiring Immediate Attention). The number of IVVs or model validations has increased significantly in the past couple of years, likely for two reasons. First, the technology and automated solutions (software) in most institutions have now been in place for five to ten years without being significantly reviewed or revised. Second, the global reach of these institutions is greater. Even community banks and credit unions that are local or regional are seeing the effects.
Model Validation
First, what is a model? According to the Board of Governors of the Federal Reserve System (SR 11-7, 2011):
…the term model refers to a quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates… The definition of model also covers quantitative approaches whose inputs are partially or wholly qualitative or based on expert judgment, provided that the output is quantitative in nature. (p. 2)
Figure 2: Example page from an Independent Verification and Validation (IVV) report.
Thus a model is any process or system that translates, manipulates, aggregates, summarizes, calculates, or makes decisions on data. An example is your anti-money laundering (AML) transaction monitoring program: it detects suspicious or anomalous activity and presents it for review. The challenge lies in what is not presented; that is where we have to ask whether we should have seen something or not. It is what we do not see that is the real risk to the institution. Models are used in a number of places:
Models meeting this definition might be used for analyzing business strategies, informing business decisions, identifying and measuring risks, valuing exposures, instruments or positions, conducting stress testing, assessing adequacy of capital, managing client assets, measuring compliance with internal limits, maintaining the formal control apparatus of the bank, or meeting financial or regulatory reporting requirements and issuing public disclosures. (Board of Governors of the Federal Reserve System (SR 11-7), 2011, p. 2)
Model validation is a risk control, helping us manage areas of automation that range from low to high risk. Automation itself reduces risk by removing human error, handling volume, and surfacing data trends. However, it introduces risk through type 1 and type 2 errors (false positives and false negatives), tuning, and data, business, and regulatory changes.
“Model risk occurs primarily for two reasons:
- The model may have fundamental errors and may produce inaccurate outputs when viewed against the design objective and intended business uses…
- The model may be used incorrectly or inappropriately.”
(Board of Governors of the Federal Reserve System Office of the Comptroller of the Currency (SR 11-7a1), 2011, p. 3)
As the variables behind the model become more and more out of date, inaccuracies can be introduced. In addition, many of us try to leverage technology to solve challenges it was not intended or designed for.
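One way a validation can quantify these type 1 and type 2 errors is to compare model alerts against the dispositions of a reviewed sample. The sketch below is a minimal, hypothetical illustration of that comparison; the labeled sample data is invented and does not come from any particular system.

```python
# Hypothetical reviewed sample: each item is (model_alerted, truly_suspicious).
sample = [
    (True, True), (True, False), (True, False), (False, False),
    (False, True), (True, False), (False, False), (False, False),
]

false_pos = sum(1 for alerted, suspicious in sample if alerted and not suspicious)
false_neg = sum(1 for alerted, suspicious in sample if not alerted and suspicious)
alerted_total = sum(1 for alerted, _ in sample if alerted)
suspicious_total = sum(1 for _, suspicious in sample if suspicious)

# Type 1 error: alerts that investigation showed were not suspicious.
print(f"False positive rate: {false_pos / alerted_total:.0%} of alerts")
# Type 2 error: suspicious activity the model never surfaced.
print(f"False negative rate: {false_neg / suspicious_total:.0%} of known suspicious items")
```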
Components
“All model components—inputs, processing, outputs, and reports—should be subject to validation; this applies equally to models developed in-house and to those purchased from or developed by vendors or consultants.” (Board of Governors of the Federal Reserve System (SR 11-7), 2011, p. 3)
Figure 3: IVV Methodology.
Input
Input covers the data, interfaces, import utilities, and sources that support the model. Here the review looks at data standardization (free-formed text field standards), data normalization (for example, date conversion from European to US format), data validation controls (mandatory, required, and supplemental/referential fields), and reconciliation (confirming all transactions from the source are received and verified). The inventory of controls needs to be reviewed and evaluated: do you have weak, moderate, or strong data controls? Examples:
Weak controls:
- Import test for formatting, field and record length checks;
- Type casting: text, number, monetary, float, date, alpha characters only, alphanumeric;
- Test for missing data in mandatory fields (field required to make the system operate); and
- Test for missing data in required fields (fields required for system to meet regulatory compliance).
Moderate controls:
- Test/validate date type, e.g. US, European;
- Text length test for possible truncation;
- Country code length test (2 characters vs. 3 characters);
- Unexpected data values identified; and
- Records imported match the records in the file (using header information such as the number of records in the file…).
Strong controls:
- Field data test, for example a two-character country code meets the ISO-2 country standard and is validated on import;
- Product type ID accurate/validated on import;
- Account type ID accurate/validated on import; and
- First-time data values identification.
What about the quantity, quality, and appropriateness of the data? Are the required fields 100% populated; is the data correct; and is it what the system needs to run properly? Is the same true for mandatory fields and referential fields? If these data requirements are not met, is there an error log recording the failures? There are a number of questions that need to be answered for input controls.
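As a rough illustration of how some of these input controls might be automated, the Python sketch below applies a few of the weak, moderate, and strong checks described above to an incoming record and writes failures to an error log. The field names, country list, and rules are hypothetical, not the configuration of any particular AML system.

```python
from datetime import datetime

# Hypothetical input controls for an incoming transaction record.
ISO2_COUNTRIES = {"US", "GB", "DE", "FR", "CA"}          # strong control: validated code list
MANDATORY_FIELDS = ["account_id", "amount", "txn_date"]  # fields the system needs to operate
REQUIRED_FIELDS = ["country_code", "product_type"]       # fields needed for regulatory compliance

def validate_record(record: dict, error_log: list) -> bool:
    """Return True if the record passes; append any failures to error_log."""
    ok = True

    # Weak control: missing-data tests for mandatory and required fields
    for field in MANDATORY_FIELDS + REQUIRED_FIELDS:
        if not record.get(field):
            error_log.append(f"{record.get('record_id')}: missing {field}")
            ok = False

    # Weak control: type casting of the monetary amount
    try:
        float(record.get("amount", ""))
    except ValueError:
        error_log.append(f"{record.get('record_id')}: amount is not numeric")
        ok = False

    # Moderate control: date type test (assumes US MM/DD/YYYY format)
    try:
        datetime.strptime(record.get("txn_date", ""), "%m/%d/%Y")
    except ValueError:
        error_log.append(f"{record.get('record_id')}: bad or non-US date")
        ok = False

    # Strong control: two-character country code validated against an ISO list
    if record.get("country_code") not in ISO2_COUNTRIES:
        error_log.append(f"{record.get('record_id')}: unknown country code")
        ok = False

    return ok

errors: list = []
sample = {"record_id": "T1001", "account_id": "A55", "amount": "250.00",
          "txn_date": "04/30/2014", "country_code": "US", "product_type": "DDA"}
print(validate_record(sample, errors), errors)
```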
Process
This is where the bulk of the model's work happens: data translations such as address parsing, calculated data, derived data, and any other data manipulations within the model. The model creates aggregations, calculations, translations, thresholds, and transformations of the data for trending, detection of outliers, ratios, alerts, and heat maps, in preparation for output. Here the questions concern the calculations, supporting data, thresholds, filtering criteria, aggregation assumptions, and any calculated refinement or learning.
Are your configurations in line with the business and its risk tolerances; do they match your business assumptions; and have your thresholds and configurations been updated as the business has changed over time? Does your system still surface outliers, or is the false positive rate increasing at a rate disproportionate to volume? Do alerts fall at the 70th, 80th, or 90th percentile, and which is appropriate for your risk tolerance? These are just some of the questions that need to be answered.
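To make the percentile question concrete, here is a minimal sketch of how a configured threshold might be checked against the current distribution of aggregated activity. The data, field names, and 90th-percentile cut-off are assumptions for illustration only, not settings from any specific monitoring system.

```python
import math

# Hypothetical aggregated monthly cash activity per customer (in USD).
monthly_cash_totals = {
    "C001": 1200.0, "C002": 8600.0, "C003": 15400.0,
    "C004": 430.0,  "C005": 9800.0, "C006": 2300.0,
}

def percentile(values, pct):
    """Nearest-rank percentile; pct is given as 0-100."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

current_threshold = 10000.0                        # threshold currently configured in the model
p90 = percentile(monthly_cash_totals.values(), 90)

# If the configured threshold sits far below the 90th percentile of observed
# activity, alert volume (and the false positive rate) will likely be high;
# if it sits far above, outliers may never surface for review.
alerts = {cid: amt for cid, amt in monthly_cash_totals.items() if amt >= current_threshold}
print(f"90th percentile of activity: {p90:,.0f}")
print(f"Customers alerting at the current threshold: {alerts}")
```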
Output
Output is about results: what final data and calculations are being presented? Did information flow through the model effectively and correctly? Are the outputs in line with your expectations; do they mitigate risk; do they address business risk; and are they substantive?
The other factor to consider is presentation. You need good workflows and controls that drive compliance and standardization. Are your highest risks presented in priority order, and are there aging, quality controls, and escalation points and processes?
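One simple way to think about these presentation controls is a work queue that orders open alerts by risk and age and flags items past an escalation point. The sketch below assumes hypothetical risk scores and a 30-day escalation threshold; both are illustrative choices, not prescribed values.

```python
from datetime import date

# Hypothetical open alerts: (alert_id, risk_score 1-100, date created).
open_alerts = [
    ("A-101", 85, date(2014, 4, 1)),
    ("A-102", 40, date(2014, 4, 20)),
    ("A-103", 85, date(2014, 3, 10)),
]

ESCALATION_AGE_DAYS = 30   # assumed escalation point; tune to your policy
today = date(2014, 5, 1)

# Present highest risk first; break ties by age so the oldest item surfaces first.
queue = sorted(open_alerts, key=lambda a: (-a[1], a[2]))

for alert_id, score, created in queue:
    age = (today - created).days
    flag = "ESCALATE" if age > ESCALATION_AGE_DAYS else ""
    print(f"{alert_id}  risk={score:3d}  age={age:3d}d  {flag}")
```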
Reports
Reporting must be clear and concise, with good workflows and rollup reporting, and these reports must support aggregation of risk at different levels. Systems can produce output in alert/case format or in report format; either is fine depending on volume. In report format, the identification of potential risk must be clearly demonstrated. For example, if you find yourself scanning records of cash deposits and adding them up by hand to see whether an individual has deposited more than $10,000 in a period of time, the reports may not be clear enough.
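As a concrete version of that example, the sketch below rolls up cash deposits per individual over a lookback period and reports only the totals that cross the $10,000 mark, which is far easier to review than scanning raw deposit records. The data and the five-day window are assumptions made for illustration.

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical cash deposit records: (customer_id, deposit_date, amount).
deposits = [
    ("C001", date(2014, 4, 28), 4000.0),
    ("C001", date(2014, 4, 29), 3500.0),
    ("C001", date(2014, 5, 1), 3200.0),
    ("C002", date(2014, 4, 30), 9500.0),
]

WINDOW = timedelta(days=5)      # assumed aggregation period
THRESHOLD = 10000.0             # reporting threshold from the example above
as_of = date(2014, 5, 1)

# Roll up cash deposits per customer within the lookback window.
totals = defaultdict(float)
for customer, dep_date, amount in deposits:
    if as_of - WINDOW <= dep_date <= as_of:
        totals[customer] += amount

# Report only the customers whose aggregate crosses the threshold.
for customer, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    if total > THRESHOLD:
        print(f"{customer}: {total:,.2f} in cash deposits over the last {WINDOW.days} days")
```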
Figure 4: Sample timeline to complete an Independent Verification and Validation (IVV)
IVV Report
The IVV or model validation report should provide the following details: the baseline configurations; the analysis results from observations, the staff interviews conducted, and the gaps between compliance expectations and actual results; and the review observations and recommendations. The report should be based on tested and observed result sets, viewed in the context of industry peers. The report sections follow the three parts of the overall system (input, process, and output):
- Executive summary;
- Top recommendations;
- Map policies and procedures to system/process gaps;
- Document data analysis (input);
- Document product/system/model analysis (process and output); and
- Provide observations and recommendations.
This is one example approach to a model validation report.
Vendor Due Diligence
In an independent verification and validation or model validation, the "independent" part refers to segregation of duties from, and lack of connection to, the current process you are looking to validate. Independence is measured by two factors:
- Distance – how far you are removed from the original project/model setup/changes.
- Time – how long since the vendor was involved in the project/model setup/changes.
A good rule of thumb is that the reviewer should not have been involved in the setup of, or changes to, the model within the last 12 to 18 months, and should be:
- Independent;
- Knowledgeable about product or technology;
- Knowledgeable about compliance;
- Knowledgeable about the process;
- Knowledgeable about your lines of business; and
- Equipped with a strong model validation methodology.
Conclusions
The entire process must be evaluated in the context of your policies and procedures, risk tolerance, business lines, industry peers, and products and services. Remember, a strong model validation policy is risk mitigation and supports a strong governance program.
References
Board of Governors of the Federal Reserve System Office of the Comptroller of the Currency. (2011, April 4). Supervision and Regulation Letters (SR 11-7a1). Retrieved May 1, 2014, from Board of Governors of the Federal Reserve System: http://www.federalreserve.gov/bankinforeg/srletters/sr1107a1.pdf
Board of Governors of the Federal Reserve System. (2011, April 4). Supervision and Regulation Letters (SR 11-7). Retrieved April 30, 2014, from Board of Governors of the Federal Reserve System: http://www.federalreserve.gov/bankinforeg/srletters/sr1107.htm
NASA. (2012, June 6). Independent Verification and Validation Framework (IVV 09-1). Retrieved November 7, 2014, from http://www.nasa.gov/sites/default/files/ivv_09-1_-_rev_o.pdf
Wikipedia. (2014). Independent software verification and validation. Retrieved November 7, 2014, from http://en.wikipedia.org/wiki/Independent_software_verification_and_validation
If you would like to know more about ARC Risk and Compliance, and our approach to an Independent Verification and Validation (IVV) or model validation, please contact us.