
ACFX — Actionable Counterfactual eXplainer


ACFX (Actionable Counterfactual eXplainer) is a model-agnostic Explainable AI (XAI) framework for generating actionable counterfactual explanations for machine learning models.

It answers the question:

What minimal and feasible changes to the input would lead to a desired prediction outcome?

Key Features

  • Model-agnostic counterfactual explanations
  • Actionability and feasibility constraints
  • Support for causal structures and expert knowledge
  • Built-in benchmarking framework
  • Python API and graphical user interface (GUI)
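The guiding question above can be illustrated with a library-free toy sketch (the threshold model, the bounds, and the choice of which feature is immutable are illustrative assumptions, not ACFX API): search for the smallest feasible change to one feature that flips the prediction.

```python
import numpy as np

# Toy "model": predict class 1 when the weighted score crosses a threshold.
def predict(x):
    return int(x @ np.array([0.6, 0.4]) >= 0.5)

x = np.array([0.2, 0.3])  # query instance, currently predicted as class 0

# Feasibility: feature 1 is immutable; feature 0 may only move within [0, 1].
# Minimality: prefer the candidate with the smallest L1 change.
best = None
for v in np.linspace(0.0, 1.0, 101):
    cand = np.array([v, x[1]])
    if predict(cand) == 1:
        cost = abs(v - x[0])
        if best is None or cost < best[1]:
            best = (cand, cost)

print(best[0], round(best[1], 2))  # smallest feasible change that flips the class
```

ACFX generalizes this idea to arbitrary models, multiple features, and causal constraints; the brute-force scan here is only for intuition.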

Installation

From PyPI

conda create --name acfx_env python=3.11
conda activate acfx_env
conda install pip
pip install acfx

From source

git clone https://github.com/sbobek/acfx
cd acfx/src
pip install .

Quick Start

from acfx import AcfxEBM
from interpret.glassbox import ExplainableBoostingClassifier
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
import numpy as np

# Load sample data
data = load_iris(as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=42
)

# Train the model, then wrap it in the explainer
model = ExplainableBoostingClassifier()
model.fit(X_train, y_train)
explainer = AcfxEBM(model)

# Define feature bounds
pbounds = {col: (X_train[col].min(), X_train[col].max()) for col in X_train.columns}

# Example adjacency matrix and causal order
adjacency_matrix = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [0.8, 0.0, 0.0, 0.0],
    [0.0, 0.6, 0.0, 0.0],
    [0.5, 0.0, 0.7, 0.0]
])
causal_order = [0, 1, 2, 3]

# Fit explainer
explainer.fit(
    X=X_train,
    adjacency_matrix=adjacency_matrix,
    causal_order=causal_order,
    pbounds=pbounds,
    y=y_train,
    features_order=X_train.columns.tolist()
)

# Generate a counterfactual for a class other than the current prediction
query_instance = X_test.iloc[0].values
original_class = model.predict([query_instance])[0]
desired_class = (original_class + 1) % 3  # iris has 3 classes; pick any other one
cf = explainer.counterfactual(desired_class=desired_class, query_instance=query_instance)
print(cf)

⚠️ The adjacency matrix above is a simple example. In practice, it can be specified from expert knowledge or learned with causal-discovery tools such as DirectLiNGAM.
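A quick sanity check you can run on any adjacency matrix before passing it to the explainer, assuming the LiNGAM convention that entry [i, j] is the causal effect of feature j on feature i (so the matrix, permuted by the causal order, should be strictly lower-triangular):

```python
import numpy as np

# Example adjacency matrix from the Quick Start.
A = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [0.8, 0.0, 0.0, 0.0],
    [0.0, 0.6, 0.0, 0.0],
    [0.5, 0.0, 0.7, 0.0],
])
causal_order = [0, 1, 2, 3]

# Every edge must point from earlier to later in the causal order,
# i.e. A permuted by causal_order has a zero upper triangle (incl. diagonal).
P = A[np.ix_(causal_order, causal_order)]
assert np.allclose(np.triu(P), 0.0), "adjacency matrix contradicts causal_order"

# List the encoded edges as (parent, child, weight) triples.
edges = [(j, i, float(A[i, j])) for i in range(4) for j in range(4) if A[i, j] != 0]
print(edges)  # [(0, 1, 0.8), (1, 2, 0.6), (0, 3, 0.5), (2, 3, 0.7)]
```

If the matrix is learned rather than hand-specified, DirectLiNGAM implementations typically expose both the matrix and the causal order after fitting, which can be checked the same way.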

Usage Tutorials

Additional tutorials and advanced examples are available in the project documentation.

Benchmarking

Install dependencies

pip install acfx[benchmark]

Run benchmark

git clone https://github.com/sbobek/acfx
cd acfx/src
python -m acfx.benchmark.main

⚠️ Always run the benchmark as a module from acfx/src using python -m. Running the script directly, or from another directory, may cause import or path errors.

Citation

TBD
