Explainable Artificial Intelligence

An approach to XAI: interpreting Deep Neural Networks through a Decision Tree written in Prolog.


📝 Table of Contents

• About
• Project Topology
• Splitting Criteria
• Prolog Program Usage
• Prolog Results
• Built Using
• Authors
• Acknowledgements

📋 About

This project is an attempt to interpret Deep Neural Networks through a Decision Tree. The latter is a Prolog implementation of a Decision Tree with multiple splitting criteria; the criteria used to measure the goodness of split conditions are information gain, gain ratio, and Gini index.

Dataset sources:

Project paper: Studio sull'interpretabilità della conoscenza di una Rete Neurale attraverso un Albero Decisionale (A study on the interpretability of a Neural Network's knowledge through a Decision Tree)

🗂 Project Topology

|-- data_preprocessing
|   |-- obesity/...
|   |-- obesity_cnn/...
|   |-- heart/...
|   |-- stroke/...
|   |-- create_database_elab.pl
|   |-- create_dataset.pl
|   |-- preprocessing.pl
|
|-- data
|   |-- obesity/...
|   |-- obesity_cnn/...
|   |-- heart/...
|   |-- stroke/...
|   
|-- dataset
|   |-- obesity_dataset.csv
|
|-- splitting_criteria
|   |-- gini_index.pl
|   |-- information_gain.pl
|   |-- gain_ratio.pl
|
|-- output
|   |-- matrix
|   |   |-- matrix_gini.txt
|   |   |-- matrix_gain.txt
|   |   |-- matrix_gainratio.txt
|   |
|   |-- tree
|   |   |-- tree_gini.txt
|   |   |-- tree_gain.txt
|   |   |-- tree_gainratio.txt
|   |
|   |-- neural_networks
|   |   |-- mlp.ipynb
|   |   |-- cnn.ipynb
|
|-- utility.pl
|-- writes.pl
|-- classify.pl
|-- tree_induction.pl

📊 Splitting Criteria

Gini Index

Information Gain

Gain Ratio
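
For reference, these are the standard textbook definitions of the three criteria (the authoritative implementations are the repository's gini_index.pl, information_gain.pl, and gain_ratio.pl). For a node with example set S, class proportions p_1, …, p_k, and a candidate attribute A partitioning S into subsets S_v:

$$\mathrm{Gini}(S) = 1 - \sum_{i=1}^{k} p_i^2$$

$$\mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|}\, H(S_v), \qquad H(S) = -\sum_{i=1}^{k} p_i \log_2 p_i$$

$$\mathrm{GainRatio}(S, A) = \frac{\mathrm{Gain}(S, A)}{\mathrm{SplitInfo}(S, A)}, \qquad \mathrm{SplitInfo}(S, A) = -\sum_{v} \frac{|S_v|}{|S|} \log_2 \frac{|S_v|}{|S|}$$

As a minimal SWI-Prolog sketch of the first formula (gini/2 here is a hypothetical helper for illustration, not the predicate defined in splitting_criteria/gini_index.pl):

    % gini(+ClassCounts, -Gini): Gini impurity 1 - sum_i p_i^2,
    % computed from a list of per-class example counts.
    gini(Counts, Gini) :-
        sum_list(Counts, Total),
        Total > 0,
        foldl([C, A0, A]>>(P is C / Total, A is A0 + P * P),
              Counts, 0, SumSq),
        Gini is 1 - SumSq.

    % ?- gini([120, 130], G).
    % G = 0.4992.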

👩‍💻 Prolog Program Usage

Start SWI-Prolog:

$ swipl

Load the tree induction program:

[tree_induction].

Load one of the available datasets:

load_dataset(<obesity|obesity_cnn|heart|stroke|stroke_ohe>).

Unload the dataset:

unload_dataset(<obesity|obesity_cnn|heart|stroke|stroke_ohe>).

Run all the tree inductions:

induce.

or run a single tree induction for a given splitting criterion:

induce_tree(<gini|gain|gainratio>, Tree).

Print the Confusion Matrix:

confusion_matrix.

Load the data pre-processing program: (optional)

[data_preprocessing/preprocessing].

Run the pre-processing program: (optional)

preprocess(<obesity|obesity_cnn|heart|stroke|stroke_ohe>).
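
Putting the steps together, a typical session might look like this (the predicates are exactly those listed above; the dataset name and query order are just one example, induce_tree/2 can replace induce/0 for a single criterion, and confusion_matrix is assumed to report on the most recent induction):

    $ swipl
    ?- [tree_induction].
    ?- load_dataset(obesity).
    ?- induce.                 % build the trees for all three criteria
    ?- confusion_matrix.       % print the resulting confusion matrix
    ?- unload_dataset(obesity).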

🔖 Prolog Results
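
Reading the figures below: the counts are consistent with unclassified tests being excluded from the metrics, and accuracy and error then follow from the confusion matrix in the usual way:

$$\mathrm{ACC} = \frac{TN + TP}{TN + FP + FN + TP}, \qquad \mathrm{Error} = 1 - \mathrm{ACC}$$

For example, in the first Gini run, 253 − 3 = 250 tests are classified and ACC = (120 + 114) / 250 = 0.936.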

  • Obesity Dataset with CNN's target

    • Gini

      Performed tests: 253
      Unclassified tests: 3
      True negative  (TN): 120	 False positive (FP): 7
      False negative (FN): 9	 True positive  (TP): 114
      Accuracy (ACC): 0.936
      Error: 0.064
      
    • Information Gain

      Performed tests: 253
      Unclassified tests: 3
      True negative  (TN): 118	 False positive (FP): 9
      False negative (FN): 11	 True positive  (TP): 112
      Accuracy (ACC): 0.920
      Error: 0.080
      
    • Gain Ratio

      Performed tests: 253
      Unclassified tests: 3
      True negative  (TN): 123	 False positive (FP): 5
      False negative (FN): 9	 True positive  (TP): 113
      Accuracy (ACC): 0.944
      Error: 0.056
      
  • Obesity Dataset with NN's target

    • Gini

      Performed tests: 253
      Unclassified tests: 7
      True negative  (TN): 128	 False positive (FP): 8
      False negative (FN): 6	 True positive  (TP): 104
      Accuracy (ACC): 0.9431
      Error: 0.0569
      
    • Information Gain

      Performed tests: 253
      Unclassified tests: 9
      True negative  (TN): 129	 False positive (FP): 7
      False negative (FN): 5	 True positive  (TP): 103
      Accuracy (ACC): 0.9508
      Error: 0.0492
      
    • Gain Ratio

      Performed tests: 253
      Unclassified tests: 8
      True negative  (TN): 132	 False positive (FP): 3
      False negative (FN): 7	 True positive  (TP): 103
      Accuracy (ACC): 0.9592
      Error: 0.0408
      

⛏️ Built Using

✍️ Authors

🎉 Acknowledgements