Compatibility with Transformers.jl #164
@adrhill this is related to JuliaTrustworthyAI/CounterfactualExplanations.jl#413 and perhaps a good first step towards integrating our systems a bit more 😄
Sorry for the late answer @ceferisbarov, @pat-alt, I caught a bad case of COVID and spent last week recovering from it!
This constraint is not intended. I dug into it, and it comes from overly strict type annotations in the XAIBase interface. I'll leave this issue open to track compatibility of ExplainableAI.jl with Transformers.jl. Do you have a specific use case you expected to work that you could share?
I hope you are doing better now! Here is an example:

```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace
using ExplainableAI

classifier = hgf"gtfintechlab/FOMC-RoBERTa:ForSequenceClassification"
encoder = hgf"gtfintechlab/FOMC-RoBERTa"[1]
analyzer = IntegratedGradients(classifier)
input = encode(encoder, "Hello, world!")
expl = analyze(input, analyzer)
```

For comparison, the equivalent workflow with transformers-interpret in Python:

```python
from transformers_interpret import SequenceClassificationExplainer

cls_explainer = SequenceClassificationExplainer(
    model,
    tokenizer,
)
word_attributions = cls_explainer("I love you, I like you")
```
Sorry to hear @adrhill, hope you've recovered by now.
Thanks, things are getting better! I'm addressing this issue by updating the ecosystem interface in Julia-XAI/XAIBase.jl#20. |
That was quick, thanks! I don't have anything else to add. I can use the new version and give feedback if I face any issues. Please, let me know if I can help in any other way. |
I just merged PR #166, which includes the changes from Julia-XAI/XAIBase.jl#20. |
Sorry, I am having laptop issues, so I won't be able to try it this week. To be clear, I am supposed to create a new analyzer, since the existing ones do not support Transformer models, right? |
Hi @ceferisbarov, I hope #176 clears up how to use the package. |
Transformers.jl models require `NamedTuple` input, while ExplainableAI.jl analyzers require a subtype of `AbstractArray`. We can solve this by modifying XAIBase.jl and ExplainableAI.jl to support the Transformers.jl interface. I can start working on a PR if the maintainers are interested.
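The type mismatch can be illustrated with a toy Julia sketch. Note that `analyze_toy` and the shape of `input` are hypothetical stand-ins for illustration, not the actual package APIs:

```julia
# Toy sketch of the dispatch mismatch: analyzers constrain their input to
# AbstractArray, while Transformers.jl encoders return NamedTuples.
analyze_toy(x::AbstractArray) = sum(abs, x)  # stands in for analyze(input, analyzer)

input = (token = rand(5, 3),)  # roughly the shape of an encoder output
# analyze_toy(input)           # MethodError: no method matching analyze_toy(::NamedTuple)
analyze_toy(input.token)       # works once the wrapped array is extracted
```

Relaxing (or removing) the `AbstractArray` annotation in XAIBase.jl would let analyzers forward `NamedTuple` inputs to the model unchanged, which is what the linked XAIBase.jl PR does.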