On the Explanatory Power of Decision Trees
In EGC 2022, vol. RNTI-E-38, pp. 147-158
Abstract
Decision trees are a learning model suited to applications where the interpretability of decisions is of paramount importance. Here we examine the ability of binary decision trees to extract, minimize, and count abductive explanations and contrastive explanations. We prove that the set of all irredundant abductive explanations (also known as sufficient reasons) of an instance can be of exponential size; generating the full set may therefore be out of reach. Moreover, two sufficient reasons of the same instance can differ on all their attributes, so computing a single sufficient reason gives only a fragmentary view of the possible explanations. We introduce the notions of necessary and relevant attributes for an explanation, as well as the notion of the explanatory importance of an attribute, and we show that these notions are useful for deriving a synthetic view of the set of all sufficient reasons of an instance.
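To make these notions concrete, here is a minimal brute-force sketch, not the authors' algorithm, using a hypothetical three-attribute toy tree that is not taken from the paper. A term built from an instance's attribute values is an abductive explanation if every completion of it reaches the same class as the instance, and a sufficient reason if it is subset-minimal among abductive explanations; exhaustive enumeration is only feasible for tiny trees, which is consistent with the exponential-size result stated above.

```python
from itertools import combinations, product

ATTRS = ["a", "b", "c"]

def classify(x):
    """Hypothetical toy binary decision tree over Boolean attributes:
    internal nodes test an attribute, leaves carry a class label."""
    if x["a"] == 1:
        return 1 if x["b"] == 1 else 0
    return 1 if x["c"] == 1 else 0

def is_abductive(term, instance):
    """A term (a subset of the instance's literals) is an abductive
    explanation if every full assignment extending it gets the same
    class as the instance itself."""
    target = classify(instance)
    free = [a for a in ATTRS if a not in term]
    for values in product([0, 1], repeat=len(free)):
        x = dict(term)
        x.update(zip(free, values))
        if classify(x) != target:
            return False
    return True

def sufficient_reasons(instance):
    """Enumerate subset-minimal abductive explanations (sufficient
    reasons) by brute force over all subsets of the instance."""
    reasons = []
    for k in range(len(ATTRS) + 1):  # increasing size => minimality
        for subset in combinations(ATTRS, k):
            term = {a: instance[a] for a in subset}
            # Key-subset suffices: all values come from the same instance.
            if is_abductive(term, instance) and \
               not any(set(r) <= set(term) for r in reasons):
                reasons.append(term)
    return reasons

instance = {"a": 1, "b": 1, "c": 1}
print(sufficient_reasons(instance))
```

On the instance (a=1, b=1, c=1), the sketch prints the two sufficient reasons {a: 1, b: 1} and {b: 1, c: 1}, illustrating that a single instance may admit several distinct sufficient reasons and that returning just one of them gives only a partial picture.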