How to use an NER model outside spaCy? #7704
-
I'm not aware of anyone having done this before, and it's not really a usage pattern we support. If you're interested in getting at the weights, it's not that complicated: look at how serialization works in Thinc, see https://github.com/explosion/thinc/blob/f92bb9c15933724e68547efd90648f26096df035/thinc/model.py#L439. Out of curiosity, why and how do you want to use the NER model's weights outside spaCy? Since NER depends on the tokenization and a tok2vec component, using it in isolation will be rather difficult.
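For what it's worth, here is a minimal, unofficial sketch of what "getting at the weights" could look like: load a trained pipeline (the path `./my_ner_model` is a hypothetical placeholder), walk the Thinc model behind the `ner` pipe, and inspect each layer's parameter arrays. This only exposes the raw arrays; the transition-based parsing logic around them is not exported.

```python
# Minimal sketch (not an officially supported workflow): inspect the Thinc
# model behind the "ner" pipe. The path "./my_ner_model" is a placeholder.
import spacy

nlp = spacy.load("./my_ner_model")
ner_model = nlp.get_pipe("ner").model      # the underlying thinc.api.Model

for layer in ner_model.walk():             # walk the whole layer tree
    for name in layer.param_names:
        if layer.has_param(name):
            weights = layer.get_param(name)    # an array (numpy on CPU)
            print(layer.name, name, weights.shape)

# Thinc's own serialization, as referenced in the link above:
raw_bytes = ner_model.to_bytes()
```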
-
Hi @polm, I'm having a similar issue here. We aim to deploy our classifiers from within Vespa (an open-source search application). Vespa allows text classification at semantic-search time and supports ONNX, TensorFlow, XGBoost, and other formats. spaCy's API has been very convenient so far, but it's becoming hard to integrate with other AI services when there's no real support for the Open Neural Network Exchange. Given that ONNX is now a common standard for model interoperability, are there ways to produce a simple text classifier in ONNX format using spaCy? Have I missed the proper way to do this, or does spaCy just not natively support ONNX?
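To illustrate one possible workaround (this is not a spaCy feature): since the built-in textcat has no ONNX exporter, a common pattern is to train a plain PyTorch classifier on features computed with spaCy (for example `doc.vector`) and export that classifier with `torch.onnx.export`. The tiny classifier below is a hypothetical stand-in, not a spaCy model.

```python
# Hypothetical workaround: a plain PyTorch classifier over 300-dim document
# vectors (e.g. doc.vector from a spaCy pipeline), exported to ONNX.
import torch
import torch.nn as nn

class TinyTextClassifier(nn.Module):
    def __init__(self, n_features=300, n_labels=2):
        super().__init__()
        self.linear = nn.Linear(n_features, n_labels)

    def forward(self, x):
        return torch.softmax(self.linear(x), dim=-1)

model = TinyTextClassifier()
dummy_input = torch.zeros(1, 300)          # shape of one document vector
torch.onnx.export(
    model,
    dummy_input,
    "textcat.onnx",
    input_names=["doc_vector"],
    output_names=["label_probs"],
    dynamic_axes={"doc_vector": {0: "batch"}},
)
```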
-
Hi,
I am using spaCy to train an NER model, and I want to use that model outside spaCy.
1.) Is it possible to convert the model to a PyTorch model or any other format and then use it outside spaCy?
2.) How can I get the model weights (for the individual components in the NER model and for the whole model)?
3.) Is it possible to extract the component or model weights and use them outside spaCy with PyTorch?
I also want to deploy the model on TensorRT to run inference. Please let me know how I can do that as well.
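As a rough, unofficial sketch for questions 2.) and 3.) above: the component's parameter arrays can be collected via Thinc's model API, saved with numpy, and reloaded as PyTorch tensors. This captures only the raw arrays, not the transition-based parsing logic, so it is not a drop-in PyTorch or TensorRT model; the paths and key names below are hypothetical.

```python
# Unofficial sketch: dump the "ner" component's parameters to an .npz file,
# then reload them as PyTorch tensors. Paths and key names are made up.
import numpy
import spacy
import torch

nlp = spacy.load("./my_ner_model")         # hypothetical trained pipeline
ner_model = nlp.get_pipe("ner").model

params = {}
for layer in ner_model.walk():
    for name in layer.param_names:
        if layer.has_param(name):
            # one unique key per layer/parameter; arrays are numpy on CPU
            params[f"{layer.name}.{layer.id}.{name}"] = layer.get_param(name)

numpy.savez("ner_weights.npz", **params)

# Elsewhere (e.g. in a PyTorch project), load the arrays back as tensors:
loaded = numpy.load("ner_weights.npz")
tensors = {key: torch.from_numpy(loaded[key]) for key in loaded.files}
```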