MindSpore implementation of Transformers
# install from source
git clone https://github.com/lvyufeng/bert4ms
cd bert4ms
python setup.py install
from cybertron import BertTokenizer, BertModel
from cybertron import compile_model
tokenizer = BertTokenizer.load('bert-base-uncased')
model = BertModel.load('bert-base-uncased')
# get tokenized inputs
inputs = tokenizer("hello world")
# compile model
compile_model(model, inputs)
# run model inference
outputs = model(inputs)
MindSpore already provides implementations of SOTA models in its ModelZoo, but those checkpoints are all trained from scratch, so they are not faithful to the original pretrained weights. Since Hugging Face Transformers has become a convenient toolkit for research and industry tasks, I developed this tool to transfer both the checkpoints and the code from Hugging Face to MindSpore. You can use it just like Transformers to develop your own pretrained or finetuned models.
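At its core, transferring a checkpoint between frameworks mostly means renaming parameter keys (the tensor values are carried over unchanged, modulo layout differences). Below is a minimal, hypothetical sketch of that idea in plain Python; the rename rules shown (e.g. mapping Hugging Face's `LayerNorm.weight` to a MindSpore-style `layer_norm.gamma`) are illustrative assumptions, not the actual mapping table used by bert4ms.

```python
# Hypothetical key-translation sketch. The rules below are illustrative
# assumptions about HF-to-MindSpore naming; bert4ms's real mapping may differ.
RULES = [
    ("LayerNorm.weight", "layer_norm.gamma"),
    ("LayerNorm.bias", "layer_norm.beta"),
    ("word_embeddings.weight", "word_embeddings.embedding_table"),
]

def translate_key(hf_key):
    """Rewrite one Hugging Face parameter name into a MindSpore-style name."""
    for old, new in RULES:
        hf_key = hf_key.replace(old, new)
    return hf_key

def convert_state_dict(hf_state_dict):
    """Rename every key in a checkpoint dict; values pass through unchanged."""
    return {translate_key(k): v for k, v in hf_state_dict.items()}

print(translate_key("encoder.layer.0.attention.output.LayerNorm.weight"))
# -> encoder.layer.0.attention.output.layer_norm.gamma
```

A real converter also has to handle framework-specific details such as weight transposition for dense layers, but the name mapping above is the bulk of the work.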