
Update init_explainer #87


Open · wants to merge 1 commit into master
28 changes: 22 additions & 6 deletions src/explanation/shap_wrapper.py
@@ -5,11 +5,27 @@
from src.encoding.common import retrieve_proper_encoder, get_encoded_logs
from src.encoding.models import ValueEncodings
from src.explanation.models import Explanation


def _init_explainer(model):
    return shap.TreeExplainer(model)

from src.predictive_model.classification.models import ClassificationMethods


def _init_explainer(model, df, model_type: str = None):
    """
    Initialises the explainer according to the model type
    :param model: model to explain
    :param df: model training data
    :param model_type: model type

Reviewer comment (Contributor): rewrite as ":param model_type: one of the ClassificationMethods enumerators"

    :return: shap explainer corresponding to the model
    """
    if model_type in [ClassificationMethods.RANDOM_FOREST.value,
                      ClassificationMethods.DECISION_TREE.value,
                      ClassificationMethods.XGBOOST.value,
                      ClassificationMethods.ADAPTIVE_TREE.value,
                      ClassificationMethods.HOEFFDING_TREE.value]:
        return shap.TreeExplainer(model)
    if model_type in [ClassificationMethods.PERCEPTRON.value,
                      ClassificationMethods.NN.value]:
        return shap.DeepExplainer(model, df)
    return shap.KernelExplainer(model)

Reviewer comment (Contributor): Are you sure the shap.KernelExplainer function takes only 'model' as a parameter?

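For context on this question: shap.KernelExplainer expects a prediction callable plus a background dataset, so passing only the model would not work. A minimal sketch of how the fallback branch could look, not part of this PR, assuming a scikit-learn-style classifier exposing predict_proba and that df is the encoded training data:

    # Hypothetical fallback (illustrative, not part of this diff):
    # KernelExplainer expects a callable and background data, not just the model.
    background = shap.sample(df, 100)  # subsample the training data to keep KernelExplainer tractable
    return shap.KernelExplainer(model.predict_proba, background)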


def _get_explanation(explainer, target_df):
    return explainer.shap_values(target_df)
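
For orientation, the two helpers above are intended to be used together; a minimal usage sketch with illustrative inputs, not taken from this diff:

    explainer = _init_explainer(model, training_df, model_type=ClassificationMethods.XGBOOST.value)
    shap_values = _get_explanation(explainer, target_df)  # SHAP values for the rows in target_df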
@@ -21,7 +37,7 @@ def explain(shap_exp: Explanation, training_df, test_df, explanation_target, pre
    model = model[0]
    prefix_int = int(prefix_target.strip('/').split('_')[1]) - 1

    explainer = _init_explainer(model)
    explainer = _init_explainer(model, training_df)

Reviewer comment (Contributor): The explanation object contains a reference to the predictive_model; please pass the 'prediction_method' string contained in the PredictiveModel to the function you customised, otherwise your code will never be executed.

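A minimal sketch of the call the reviewer is asking for, not part of this PR; it assumes the Explanation object's predictive_model reference exposes the 'prediction_method' string mentioned above:

    explainer = _init_explainer(model,
                                training_df,
                                model_type=shap_exp.predictive_model.prediction_method)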

    target_df = test_df[test_df['trace_id'] == explanation_target].iloc[prefix_int]
    #if explanation_target is None:
    #    shap_values = explainer.shap_values(test_df.drop(['trace_id', 'label'], 1))