Lending Perspectives: 5 Lessons for Credit Unions From Apple Card’s Early Missteps

By Bill Vogeney
Chief Revenue Officer
Ent Credit Union

It didn’t take long for the new Apple Card to make the news for the wrong reasons.

None other than Steve Wozniak, a co-founder of Apple, questioned whether the algorithms used to automatically decision applications discriminated against his wife based on sex. David Hansson, another well-known tech executive, made the same claim on Twitter after his wife also received a much lower limit on her Apple Card, even though her FICO score is supposedly higher than his. Is this controversy just a short-lived social media mess, or is it part of a slowly growing mistrust of artificial intelligence?

Because Goldman Sachs does not seem to have encountered any real challenges in approving billions in new credit lines, perhaps the more important question to ask is, “What can credit unions learn from this initial backlash against the Apple Card?”

1. Machine learning or artificial intelligence is a two-edged sword.

While many credit unions are mired in a slump, making automated decisions on only 20-30% of their consumer loan applications, Goldman Sachs is likely providing an automatic, immediate decision on virtually 100% of these applications; I don’t think Apple would stand for anything less from a customer experience standpoint. There are serious limitations to the traditional decision engines most credit unions use as part of our loan origination systems, the most prevalent being that most can factor in only about 20 pieces of data. Artificial intelligence has no such limitation. If you have access to 10,000 pieces of data, you can factor all of them into the loan decision. Yet how do you manage that process? And how can you ensure that the algorithms in artificial intelligence systems aren’t learning to discriminate based on prohibited or perhaps uncomfortable criteria? Credit unions venturing into machine learning and artificial intelligence need to focus on vendors that understand lending and the Equal Credit Opportunity Act.
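To make that oversight question concrete, here is a minimal sketch, in Python, of the kind of after-the-fact disparate-impact check a credit union could run on its automated decisions. The group labels, field names and the 80% (“four-fifths”) screening threshold are illustrative assumptions for this sketch, not a description of how Goldman Sachs or any particular vendor actually monitors its models.

    # Illustrative only: an after-the-fact adverse-impact check on automated decisions.
    # Field names, groups and the four-fifths threshold are assumptions for this sketch.

    def approval_rates(decisions):
        """decisions: list of dicts like {"group": "women", "approved": True}."""
        totals, approvals = {}, {}
        for d in decisions:
            g = d["group"]
            totals[g] = totals.get(g, 0) + 1
            approvals[g] = approvals.get(g, 0) + (1 if d["approved"] else 0)
        return {g: approvals[g] / totals[g] for g in totals}

    def adverse_impact_ratio(decisions):
        """Ratio of the lowest group approval rate to the highest (1.0 = parity)."""
        rates = approval_rates(decisions)
        return min(rates.values()) / max(rates.values()), rates

    if __name__ == "__main__":
        sample = [
            {"group": "men", "approved": True},
            {"group": "men", "approved": True},
            {"group": "women", "approved": True},
            {"group": "women", "approved": False},
        ]
        ratio, rates = adverse_impact_ratio(sample)
        print(rates, ratio)
        if ratio < 0.80:  # traditional "four-fifths" screening threshold
            print("Flag for fair-lending review")

In practice, lenders generally don’t collect sex or race on non-mortgage applications, so monitoring like this typically relies on estimated proxies for group membership, which is exactly why a vendor that understands ECOA matters.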

2. The quality of your decision is dependent on the quality of your information.

Steve Wozniak also pointed out that he and his wife file joint tax returns and live in a community property state. Is your credit union in compliance when lending in these states? Are you allowing borrowers to use their entire household incomes? Furthermore, I’m in the dark when it comes to what Goldman Sachs is requiring of credit card applicants: Are they requiring applicants to submit their tax returns? If they aren’t, who knows whether Steve and his wife disclosed the same amount of income?

3. The more complex the application decision process becomes, the better prepared we have to be to explain the criteria.

A decade ago, FICO scores ruled. Now it’s not just the FICO score driving decisions; alternative data is growing in prominence. Where someone went to college, whether they earned a degree, whether they use a prepaid phone: all of this type of data is fair game with fintech lenders. I recall reading about a new lender a few years ago that determined through its machine learning that borrowers who entered their application answers in all capital letters were of lower credit quality than its borrower population as a whole. Is this something we want to explain to a borrower? On a related note, if your user interface allows the consumer to enter all caps, it doesn’t speak highly of the experience, a topic I wrote about in a past column.
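For illustration only, here is how trivially a signal like that can be computed; the function and the 90% threshold are hypothetical, and that is really the point: a feature this easy to build can be very hard to defend when you have to explain an adverse action.

    # Hypothetical alternative-data feature: did the applicant type mostly in all caps?
    # Purely illustrative; not a recommendation to use this signal.

    def all_caps_flag(free_text_answers):
        """Return True if the applicant's free-text answers are mostly upper case."""
        letters = [c for c in "".join(free_text_answers) if c.isalpha()]
        if not letters:
            return False
        upper_share = sum(c.isupper() for c in letters) / len(letters)
        return upper_share > 0.9  # threshold is an arbitrary assumption

    print(all_caps_flag(["SHIFT SUPERVISOR", "RENTING"]))   # True
    print(all_caps_flag(["Shift supervisor", "Renting"]))   # False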

4. Virtually all consumers, even well-educated ones like Steve Wozniak, are uneducated as to how lenders really make decisions.

The only thing the consumer knows is the FICO score and the pursuit of a higher one. Even some of my peers may understand only that the FICO score, as a whole, explains perhaps 60% of borrower performance. Given how lenders assign credit limits using specific credit bureau attributes, it’s reasonable to believe that Goldman Sachs looked at the history of specific line amounts for both Mr. and Mrs. Wozniak. Let’s assume Steve has a credit card with a very high line, perhaps $250,000 or more. Does his wife have the same history? What was her previous high credit? When it comes to financial education and credit scores, we often focus on line utilization, length of credit history, number of credit inquiries and payment history. Yet we rarely talk about how a borrower’s history of handling a certain amount of credit dictates our line decisions. A borrower whose largest line is $5,000 will rarely get a $50,000 line. We need to better educate our members on some of these finer points of lending.
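As a purely hypothetical sketch of that last point, here is the kind of guardrail a line-assignment strategy might include; the 2x multiplier and the floor are made-up numbers, not Goldman Sachs’ criteria or anyone else’s.

    # Hypothetical line-assignment guardrail tied to previous high credit.
    # The multiplier and floor are illustrative assumptions, not any lender's actual rule.

    def cap_credit_line(requested_line, previous_high_credit,
                        multiplier=2.0, floor=1_000):
        """Limit a new line to a multiple of the largest line the borrower
        has already handled, with a small minimum starting line."""
        cap = max(previous_high_credit * multiplier, floor)
        return min(requested_line, cap)

    # A borrower whose largest prior line is $5,000 tops out at $10,000 here,
    # regardless of score, rather than jumping straight to $50,000.
    print(cap_credit_line(50_000, 5_000))     # 10000.0
    print(cap_credit_line(50_000, 250_000))   # 50000

A rule of thumb like this is also something a member can actually be educated about: demonstrate a history of handling larger lines and the limit grows.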

5. It still pays to know your member.

At this point, I don’t believe any level of artificial intelligence at any lender can recognize who Steve Wozniak is. Google can provide lots of information on Steve, but how will any lender incorporate that into loan decisions? Whether it’s a human underwriter making a decision or your decision engine incorporating checking account and deposit balance history into the decision, we have a lot of information on our members. Let’s make sure we’re leveraging it to the utmost.

Going back to my original thought that the early controversy is part of a growing mistrust of artificial intelligence, I wonder whether the pursuit of a 100% automated decision scenario is even desirable. Members may not care that a computer approved their loan; however, my sense is they want to talk to the person who denied their loan request.

CUES member Bill Vogeney is chief revenue officer and self-professed lending geek at $6 billion Ent Credit Union in Colorado Springs, Colorado.
