SHAP (SHapley Additive exPlanations)

A game-theoretic approach to explaining the output of any machine learning model, providing consistent and locally accurate explanations.
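To make the game-theoretic idea concrete, here is a minimal pure-Python sketch of the exact Shapley value computation that underlies SHAP. It treats features as "players" and a coalition's payoff as the model output using only those features. The helper `shapley_values` and the toy value function `v` are illustrative names, not the library's API; the real package approximates these values efficiently for arbitrary models.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values by enumerating all coalitions (exponential cost,
    for illustration only).

    players: list of player (feature) names
    value: function mapping a set of players to a payoff
    """
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # Shapley weight for a coalition of size k out of n players
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal contribution of p to this coalition
                total += weight * (value(s | {p}) - value(s))
        phi[p] = total
    return phi

# Toy additive "model": prediction = 2*x1 + 3*x2, baseline 0. A coalition's
# payoff is the model output with only the coalition's features "present".
def v(coalition):
    return (2.0 if "x1" in coalition else 0.0) + (3.0 if "x2" in coalition else 0.0)

phi = shapley_values(["x1", "x2"], v)
# Local accuracy (efficiency): attributions sum to v(all) - v(empty)
assert abs(sum(phi.values()) - (v({"x1", "x2"}) - v(frozenset()))) < 1e-9
```

For this additive game the attributions recover each feature's coefficient exactly; the "local accuracy" assertion at the end is the efficiency axiom that SHAP's additive explanations are built on.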