MedicGo
Explainable AI meets persuasiveness: Translating reasoning results into behavioral change advice.
Metadata
Journal: Artificial Intelligence in Medicine (Impact Factor: 4.383)
Date: 2020 Mar 05
Type: Journal Article
Volume: 105 (2020-05), Article 101840
Authors: Dragoni M, Donadello I, Eccher C
Abstract
Explainable AI aims at building intelligent systems that are able to provide a clear, human-understandable justification of their decisions. This holds for both rule-based and data-driven methods. In the management of chronic diseases, the users of such systems are patients who must follow strict dietary rules. After receiving the food intake as input, the system performs reasoning to determine whether the user is following an unhealthy behavior. Subsequently, the system has to communicate the result in a clear and effective way; that is, the output message has to persuade the user to follow the right dietary rules. In this paper, we address the main challenges in building such systems: (i) the Natural Language Generation of messages that explain the reasoner's inconsistency; and (ii) the effectiveness of such messages at persuading the users. Results show that the persuasive explanations are able to reduce users' unhealthy behaviors.
Keywords: Explainable AI; Explainable reasoning; mHealth; Natural Language Generation; Ontologies
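The abstract describes a pipeline of the form: food-intake input → rule-based reasoning against dietary constraints → natural-language, persuasive explanation of the detected inconsistency. The sketch below is not the authors' implementation; it is a minimal toy illustration of that pipeline, assuming a simple per-nutrient daily limit as the "dietary rule" and a template-based message as the "explanation". All names (DietaryRule, check_intake, explain_violation) and the example rule and intake values are hypothetical.

```python
# Minimal illustrative sketch (hypothetical, not the paper's system): check a
# day's nutrient intake against toy dietary rules, then turn each detected
# violation into a persuasive, human-readable explanation.
from dataclasses import dataclass


@dataclass
class DietaryRule:
    nutrient: str        # e.g. "sodium"
    max_per_day: float   # allowed daily amount
    unit: str            # e.g. "mg"
    reason: str          # why the rule matters (used in the persuasive message)


def check_intake(intake: dict, rules: list) -> list:
    """Return the rules violated by today's intake (toy 'reasoning' step)."""
    return [r for r in rules if intake.get(r.nutrient, 0.0) > r.max_per_day]


def explain_violation(rule: DietaryRule, intake: dict) -> str:
    """Translate a detected inconsistency into a persuasive message (toy NLG step)."""
    consumed = intake[rule.nutrient]
    excess = consumed - rule.max_per_day
    return (
        f"Today you consumed {consumed:.0f} {rule.unit} of {rule.nutrient}, "
        f"which is {excess:.0f} {rule.unit} above your limit of "
        f"{rule.max_per_day:.0f} {rule.unit}. {rule.reason} "
        f"Try choosing lower-{rule.nutrient} options tomorrow."
    )


if __name__ == "__main__":
    rules = [DietaryRule("sodium", 2000.0, "mg",
                         "Keeping sodium low helps control your blood pressure.")]
    todays_intake = {"sodium": 2600.0}
    for violated in check_intake(todays_intake, rules):
        print(explain_violation(violated, todays_intake))
```

The paper's actual approach relies on ontologies and a reasoner to detect inconsistencies and on a dedicated NLG component for persuasion; the template string above merely stands in for that generation step.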
Journal Information
Artif Intell Med (Artificial Intelligence in Medicine), Impact Factor: 4.4
Location: Netherlands
Publisher: Elsevier

