Building Explanations for Fuzzy Decision Trees with the ExpliClas Software

Fairness, Accountability, Transparency, and Explainability have become strong requirements in most practical applications of Artificial Intelligence (AI). Fuzzy sets and systems are recognized worldwide for their outstanding contribution to modeling AI systems with a good interpretability-accuracy tradeoff. Accordingly, fuzzy sets and systems are at the core of the so-called Explainable AI. ExpliClas is a software-as-a-service tool that paves the way for interpretable and self-explainable intelligent systems. Namely, it provides users with both graphical visualizations and textual explanations associated with intelligent classifiers automatically learned from data. This paper presents the new functionality of ExpliClas regarding the generation, evaluation, and explanation of fuzzy decision trees, along with fuzzy inference-grams. This new functionality is validated with two well-known classification datasets (i.e., Wine and Pima), as well as with a real-world beer-style classifier.

Keywords: Fuzzy Systems Software, Open Source Software, Software as a Service, Fuzzy Rule-based Systems, Explainable AI