Explainable AI and trust: How news media shapes public support for AI-powered autonomous passenger drones

Public Underst Sci. 2024 Dec 9:9636625241291192. doi: 10.1177/09636625241291192. Online ahead of print.

Abstract

This study examines the relationships between attention to AI in news media, perceived AI explainability, trust in AI, and public support for autonomous passenger drones. Using structural equation modelling (N = 1,002), we found significant associations between perceived AI explainability and all three trust dimensions (performance, purpose, and process). We also found that attention to AI in the news media was positively associated with perceived AI explainability, indicating that the public acquires this perception largely through news coverage. When it came to support for autonomous passenger drones, however, only the performance dimension of trust was a relevant predictor. Our findings underscore the importance of ensuring AI explainability for the public and highlight the pivotal role of news media in shaping public perceptions of emerging AI technologies. Theoretical and practical implications are discussed.
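
For readers who want to see the hypothesized path structure in concrete form, the sketch below specifies it as a structural equation model using the Python semopy package. This is an illustrative sketch only: the choice of package and all indicator names (news1, expl1, etc.) are assumptions, not details reported in the paper.

```python
# Minimal sketch of the hypothesized path structure described in the abstract,
# assuming the semopy package; indicator names are hypothetical placeholders.
import pandas as pd
from semopy import Model

MODEL_SPEC = """
# Measurement model (hypothetical indicators for each latent construct)
NewsAttention    =~ news1 + news2 + news3
Explainability   =~ expl1 + expl2 + expl3
TrustPerformance =~ perf1 + perf2 + perf3
TrustPurpose     =~ purp1 + purp2 + purp3
TrustProcess     =~ proc1 + proc2 + proc3
Support          =~ supp1 + supp2 + supp3

# Structural model: news attention -> perceived explainability
#                   -> trust dimensions -> support for passenger drones
Explainability   ~ NewsAttention
TrustPerformance ~ Explainability
TrustPurpose     ~ Explainability
TrustProcess     ~ Explainability
Support          ~ TrustPerformance + TrustPurpose + TrustProcess
"""

def fit_sem(data: pd.DataFrame) -> pd.DataFrame:
    """Fit the sketched SEM and return parameter estimates.

    `data` is expected to have one column per observed indicator
    (news1 ... supp3); survey data of this kind is not included here.
    """
    model = Model(MODEL_SPEC)
    model.fit(data)
    return model.inspect()  # table of path estimates and standard errors
```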

Keywords: XAI; autonomous passenger drones; explainable AI; perceived explainability; public opinion; trust in AI.