There is no doubt that paying for items with the touch of a button has added a layer of convenience to our daily lives that we could only have dreamed of in the past. Gone are the days of wondering about our credit balance or long wait periods for credit card applications and limits. Now our cards are safely stored in our smartphone wallet, ready whenever we need them.
However, what happens when a new digital tool is revealed to have flaws such as alleged discrimination or bias? This was the case with the partnership between Apple and Goldman Sachs: lending decisions for Apple’s payment card, underwritten by Goldman Sachs, were called into question. Among the reported issues, gender discrimination was alleged in the awarding of credit card limits. While not the newest example, it remains a highly illustrative one of how trust in digital services and lending decisions can be eroded.
The issue of transparency
Apple’s credit lending decisions entered a full-scale media storm when the company’s co-founder Steve Wozniak reported apparent gender discrimination affecting his spouse, who was actively using the payment card. Users received an instant response to their application, along with a credit limit. Fast and efficient? Yes. Reliable and without bias? The answer is not so clear. Wozniak, along with other users, quickly realised that convenience can come at a high price to trust.
Consumers need clarity to trust digital services
In order to provide the advertised convenience and speed in handling requests, companies apply various algorithms. Banks have experience and, crucially, extensive data on credit applications. Added to this, they draw upon actuarial data on how likely an applicant is to be able to service a credit line. Validating credit applications seems like a clear-cut use case for machine-learning systems.
However, these systems are often a black box. It is not clear to outside observers, or even to the employees working with the algorithms, why certain decisions are taken. Employees or managers may also be unable to override those decisions. Unsurprisingly, this opens the door to discrimination.
Lack of clarity. Lack of trust
Without a clear understanding of the criteria that have led to a decision, customers cannot trust such a service to arrive at fair conclusions. Not only is this situation bad for the customers facing discrimination, it also causes considerable reputational damage for the lending institution or organisation in question. Relying on opaque systems for their supposed efficiency or convenience does not pay.
Working towards robust digital trust
Today we are surrounded by digital services and tools making our lives better and more convenient. However, for truly successful digitalisation, we need digital trust and transparency. Creating transparency and showing customers at which point automated decision-making might take place can alleviate concerns and promote trust.
This is precisely what the Swiss Digital Initiative is doing with the Digital Trust Label. Organisations will be able to have their digital services certified against various criteria, including transparency.