The qii_tool package is an implementation of the QII method proposed in the paper "Algorithmic Transparency via Quantitative Input Influence: Theory and Experiments with Learning Systems". The original paper discusses the transparency-privacy tradeoff, whereas this package focuses only on the transparency aspect, using QII as an influence measure for interpretable machine learning.
Install qii_tool on your system using:

```
pip install qii_tool
```

or clone the repository and run:

```
python setup.py bdist_wheel
python -m pip install dist/qii_tool-0.1.2-py3-none-any.whl
```

The following examples can be found in the experiments; a minimal usage sketch is shown after the list:
- iris dataset
- digits dataset
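
Below is a rough end-to-end sketch on the iris dataset. Only `QII.compute()`, its `evaluted_features` and `pool` arguments, and `QII.compute_unary_qii()` appear in this README; the import path, the `QII` constructor, and the remaining keyword arguments are assumptions, so refer to the experiment scripts for the exact signatures.

```python
# Rough sketch, not the package's documented API: the import path,
# constructor, and most keyword arguments below are assumptions.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

from qii_tool.qii import QII  # assumed import path

# Train any scikit-learn style classifier on the iris data.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Assumed constructor; the experiment scripts may wrap the model in a
# predictor class and pass extra arguments such as the feature count.
qii = QII(model)

x_0 = X[0:1]  # the instance whose prediction we want to explain

# QII values for all features of x_0, using the training data as the
# intervention pool. `pool` and `evaluted_features` (spelled as in this
# README) are the two arguments mentioned in this README; the rest is assumed.
influences = qii.compute(x_0=x_0, pool=X, evaluted_features=None)
print(influences)
```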
The package is implemented so that it is easy to extend to users' needs. The following are a few examples:
- set `evaluted_features` in the `QII.compute()` method to evaluate the QII value for a selected set of features; `QII.compute_unary_qii(s_i, S)` can be used to compute the Unary QII of feature `s_i` with respect to a feature set `S` (see the sketch after this list).
- set `pool` to a specific distribution in `QII.compute()`, e.g. all data points whose `sepal length` is greater than 4.6 cm.
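
Continuing the iris sketch above (with the same caveat that argument names beyond `pool`, `evaluted_features`, and `compute_unary_qii` are assumptions), the two extension points might look like this:

```python
# Continues the iris sketch above; `qii`, `x_0`, and `X` are reused.

# 1) Evaluate QII only for a chosen subset of features,
#    e.g. petal length and petal width (indices 2 and 3).
subset_influences = qii.compute(x_0=x_0, pool=X, evaluted_features=[2, 3])

# 2) Restrict the pool to a specific distribution: only data points
#    whose sepal length (feature index 0) is greater than 4.6 cm.
filtered_pool = X[X[:, 0] > 4.6]
filtered_influences = qii.compute(x_0=x_0, pool=filtered_pool)

# 3) Unary QII of feature s_i = 0 with respect to the feature set S = {1, 2, 3}.
unary = qii.compute_unary_qii(0, [1, 2, 3])

print(subset_influences, filtered_influences, unary)
```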
- The code is adapted from Shayak Sen's version.
- For further requests, such as case studies, or issues using the package, please contact Vinh (hovinh39@gmail.com).


