The Role of Explainability In AI-Driven Financial Decision Making

Athina Ioannou, M. Mahdi Tavalaei, Dorthea Vatn, Patrick Mikalef
Information Systems Frontiers, Vol. 28 (Special Issue: Innovative Applications and Ethical Implications of AI in Data Sharing and Financial Analysis: Emerging Trends and Future Directions)
10/04/2026
Keywords: Artificial intelligence; Explainability; Risk; Information adoption; Financial services
Artificial intelligence is playing a growing role in consumer financial services, yet many users lack the expertise to judge AI-generated advice and therefore rely on the explanations that accompany it. The present study investigates how explanation quality, advisor type, and decision risk jointly shape users' evaluation and adoption of financial recommendations. Building on the Information Adoption Model and the explainable AI literature, we propose a framework in which information quality, information usefulness, perceived trust, and perceived risk influence intentions to follow advice. Through a controlled online experiment, we also examine how these relationships differ across advisor types (human, low-explainability AI, high-explainability AI) and decision risk levels (low, high). Results show that the effects of information usefulness, trust, and information quality vary systematically with advisor type, explanation clarity, and decision risk. Our findings demonstrate that information adoption arises from the interplay of informational, relational, and contextual factors, extending the Information Adoption Model and offering insights for designing more effective AI–human financial advisory systems.