Tom Hinkel, Director of Compliance

First off, a big thanks to all of you who participated in the recent compliance examination experience survey.  We received a total of 80 responses this time, including 20 from non-Safe Systems customers.  Most of the results were somewhat predictable (80% of you had the FDIC as your primary federal regulator), while others were rather surprising.  Here are some of the results, along with my observations.

Demographics:

Financial institution respondents represented a total of 20 states, so we captured a good geographic cross-section of examination experience.  Although most were FDIC institutions, we also had institutions from all the other federal regulators.  85% of you are under $500M in asset size, although we also got 4 that were over $1B.  75% were more than 5 years old, and 53% of you saw your examiner within the last 6 months.

Larger vs. smaller:

One thing I found interesting about the larger institutions is that they seemed to enjoy an overall better examination experience.  They spent a little less time on average preparing, felt that things went pretty much as expected, generally got along with their examiner, and scored well.  They also challenged examination findings more often, and when they did, they were more successful in getting them removed.  However, none of them used outside help to prepare, and even though they scored well on IT, their average was slightly below the overall average (2.00 versus 1.82).  Also, none saw their score improve this time around, and one saw a decrease.  So larger institutions may have had a better-than-average examination experience as they perceived it, but they actually had slightly below-average results as measured by IT composite scores.

Newer vs. older:

Not surprisingly, newer institutions were smaller, with 95% under $500M.  They were also far more likely to use outside consultants to help with exam prep and responses.  85% felt they were highly prepared going into their last exam, and although their IT score averaged higher (2.11), twice as many of them improved their IT composite score as saw it decline.  I attribute the higher scores to the fact that newer institutions are (A) generally scrutinized more closely, and (B) still evolving many of their policies and procedures.  A + B = more examination findings.

Safety & Soundness:

The average Safety and Soundness (S&S) composite score was 2.46.  Most (79%) had no change since their last exam, but perhaps reflecting the ongoing weakness in asset quality, more than twice as many saw that score get worse than saw it get better.  IT is one of the components in the S&S score, so better IT scores will help the overall composite, but it’s clear that asset quality issues will continue to drag S&S scores down for the immediate future.

Preparation:

Looking at all responses, most institutions spent quite a bit of time preparing for their last examination.  57% of you spent more than 5 hours, but interestingly enough, that time really didn't translate into better results.  Although 73% of those felt they were very prepared for the exam, less than half felt that the exam went pretty much as expected, and 9% described their last examination as a “nightmare”!  By contrast, only 5% of those who spent less than 5 hours preparing felt the same way.  But perhaps the most significant statistic is the average IT composite score: those who spent more than 5 hours preparing averaged a score of 1.85, as opposed to 1.76 for those who spent less than 5 hours.  So is the conclusion that, as far as preparation goes, less equals more?  No, I think a better way to interpret the data is that it's better to work smarter than harder.  This leads to my next observation:

Outside assistance:

Across all institutions responding, those of you who used an outside consultant to assist with the pre-examination questionnaire seemed to have a much more favorable experience overall.  90% of you felt that the examination experience was either not bad or pretty much as expected.  But more significantly, those who used outside help also got better IT composite scores, averaging a 1.69 versus 1.82 for all respondents!

The Safe Systems advantage:

When results were filtered for Safe Systems customers, a couple of things stood out.  45% of you spent more than 5 hours preparing for your last exam (versus 63% of non-customers), 80% of you had outside help, and you generally had fewer difficulties (77% reported “no problems” vs. 65%).  So you spent less time preparing, relied more on outside help, were more confident, had fewer difficulties with the exam and examiner, and achieved similar results to those who spent more time preparing, did it all themselves, were less confident going in, and had more problems.

One statistic I'd like to see changed for the next survey is the percentage of customers who “push back” and challenge examiner findings.  Only 39% of all respondents challenged a finding, but of those who did, 71% were successful in getting the finding removed from the final report.  Since there is no penalty for an unsuccessful challenge, and a very real benefit to having a finding removed, I'd like to see findings challenged whenever possible.  The keys to a successful challenge are:

  1. Understanding the root cause of the finding, and
  2. Convincingly presenting your case that you understand the reason for the finding, and believe you’ve adequately addressed it.

With access to both dedicated regulatory compliance resources and essential documentation, Safe Systems customers are uniquely prepared for this challenge.  Armed with NetComply reports, the Quarterly Self-Assessments (guided by your Account Manager), and Annual System Review meeting minutes, you should have ample (and convincing) documentation.  And as always, if the compliance department can assist in any way with your preparation or response efforts, let us know!
