arXiv:2009.13211

Instance-based Counterfactual Explanations for Time Series Classification

Published on Sep 28, 2020
AI-generated summary

A novel model-agnostic, case-based technique called Native Guide is introduced for generating counterfactual explanations for time series classifiers by adapting existing instances to highlight discriminative areas that influence classification decisions.

Abstract

In recent years, there has been a rapidly expanding focus on explaining the predictions made by black-box AI systems that handle image and tabular data. However, considerably less attention has been paid to explaining the predictions of opaque AI systems handling time series data. In this paper, we advance a novel model-agnostic, case-based technique -- Native Guide -- that generates counterfactual explanations for time series classifiers. Given a query time series, T_{q}, for which a black-box classification system predicts class, c, a counterfactual time series explanation shows how T_{q} could change, such that the system predicts an alternative class, c'. The proposed instance-based technique adapts existing counterfactual instances in the case-base by highlighting and modifying discriminative areas of the time series that underlie the classification. Quantitative and qualitative results from two comparative experiments indicate that Native Guide generates plausible, proximal, sparse and diverse explanations that are better than those produced by key benchmark counterfactual methods.
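The core idea can be sketched in a few lines. This is a minimal toy, not the authors' implementation: it assumes a stand-in black-box classifier and uses a naive linear blend toward the nearest unlike neighbour (the "native guide") until the prediction flips, rather than the paper's weight-guided modification of discriminative subsequences. All function names here are illustrative.

```python
import numpy as np

def classify(series):
    """Stand-in black-box classifier: class 1 if the series mean exceeds 0.5."""
    return 1 if series.mean() > 0.5 else 0

def nearest_unlike_neighbour(query, case_base):
    """Find the closest case-base instance from the opposite class."""
    target = 1 - classify(query)
    candidates = [s for s in case_base if classify(s) == target]
    return min(candidates, key=lambda s: np.linalg.norm(query - s))

def counterfactual(query, case_base, steps=100):
    """Blend the query toward its native guide just enough to flip the class."""
    guide = nearest_unlike_neighbour(query, case_base)
    target = classify(guide)
    for alpha in np.linspace(0.0, 1.0, steps):
        candidate = (1 - alpha) * query + alpha * guide
        if classify(candidate) == target:
            return candidate  # minimal blend that changes the prediction
    return guide

# Toy case base: five low-valued and five high-valued length-20 series.
rng = np.random.default_rng(0)
case_base = [rng.uniform(0.0, 0.4, 20) for _ in range(5)] + \
            [rng.uniform(0.6, 1.0, 20) for _ in range(5)]
query = rng.uniform(0.0, 0.4, 20)   # classified as class 0
cf = counterfactual(query, case_base)
```

Because the counterfactual is built from a real instance of the target class, it tends to stay on the data manifold (plausible) while remaining close to the query (proximal), which is the intuition behind the instance-based approach described above.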
