
Banking Algorithms, the Apple Card and Sexism

Allegation that the Apple Card gave a husband a credit limit several times higher than his wife’s

Written by Banking Exchange staff

Banks continue to invest in algorithms built to do everything from preventing fraud to analyzing customer creditworthiness. Goldman Sachs, however, is the latest financial institution facing a backlash over the results one of those algorithms produced.

The New York Department of Financial Services has opened an investigation based on a customer’s allegation that the Apple Card gave him a credit limit several times higher than his wife’s, even though the couple files joint tax returns and his wife has a significantly higher credit score. The story has taken off on social media.

According to David Hansson, his credit limit was more than twenty times his wife’s, even though she had been more diligent about paying off her balance. The card’s algorithm was developed by Goldman Sachs.

The DFS has publicly confirmed that it considers the allegations serious enough to investigate as potential sex discrimination. Linda Lacewell, superintendent of the DFS, stated: “The Department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex.”

Hansson stated that he and his wife have been married for a number of years and that their appeals have fallen on deaf ears. Apple’s co-founder blamed the result on “big tech in 2019.” A Goldman Sachs employee, according to reports, simply stated that the algorithm was at fault and that no human was responsible for the result.

To be fair to Goldman Sachs, the couple did not reveal any difference in personal income between Hansson and his wife, which can be a major factor. Her low credit limit may have less to do with her creditworthiness than with her own personal income and how much she spends relative to that figure.

That said, the case does raise questions about relying entirely on technology, with no human judgment, when setting a customer’s credit limit. The intent here was likely not sexism: there is no evidence the algorithm itself factored in gender, and the same disparity could just as easily have occurred in reverse.
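A toy sketch can illustrate the point. The formula below is entirely hypothetical (Goldman Sachs has not published its model); it never looks at gender, yet two applicants in the same household with similar credit scores can receive very different limits, because the formula weights individual income and individual spending:

```python
# Hypothetical, simplified credit-limit formula -- NOT the actual model used
# for the Apple Card. It takes no gender input at all.
def credit_limit(credit_score, personal_income, monthly_spend):
    """Toy model: limit scales with individual income, adjusted by
    credit score and by spending relative to individual income."""
    base = personal_income * 0.2                  # 20% of individual income
    score_factor = credit_score / 700             # normalized around a 700 score
    # Spending that is high relative to one's own income reduces the limit.
    utilization = min(monthly_spend / max(personal_income / 12, 1), 1.0)
    return round(base * score_factor * (1 - 0.5 * utilization))

# Same household, same monthly spending, the lower earner even has the
# better credit score -- yet the limits diverge sharply:
spouse_a = credit_limit(credit_score=740, personal_income=250_000, monthly_spend=3_000)
spouse_b = credit_limit(credit_score=760, personal_income=30_000, monthly_spend=3_000)
```

In this sketch the gap emerges purely from the income and utilization terms, showing how an outcome can look discriminatory even when the protected attribute is never an input.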

But at the very least, the case shows the limits of plugging data into an equation without any human review.
