A right to explanation for algorithmic credit decisions in the UK

Alison T. Lui, George Lamb, Lola Durodola

    Research output: Contribution to journal › Article › peer-review


    Abstract

    This article argues for a statutory right to explanation in automated credit decision-making in the UK, as transparency and accountability are central to the rule of law. First, from a moral standpoint, we demonstrate that there is a double level of distrust in financial services and algorithms. Algorithms are unpredictable and can make unreliable decisions. Algorithmic challenges such as bias, discrimination and unfairness are exacerbated by the opacity problem commonly known as the ‘black box’ phenomenon. The informed consent process in automated credit decision-making is thus incomplete, necessitating an ex-post right to explanation to complete the informed consent procedure. Secondly, our doctrinal and comparative legal methodologies reveal that jurisdictions such as the USA, Canada, the European Union, China and Poland already provide a right to explanation to credit applicants under certain circumstances. We also present new empirical evidence of a public desire for a right to explanation for unsuccessful credit applications.
    Original language: English
    Pages (from-to): 289-317
    Number of pages: 29
    Journal: Law, Innovation and Technology
    Volume: 17
    Issue number: 1
    Early online date: 27 Feb 2025
    DOIs
    Publication status: E-pub ahead of print - 27 Feb 2025

    Bibliographical note

    Open access CC-BY

    Keywords

    • Automated credit lending
    • algorithmic bias
    • algorithmic opacity
    • explainable artificial intelligence
    • trust
    • right to explanation

