Observed Behavior
Some stop sequence inputs (e.g. "}) trigger an error:
Prediction failed.
E2102 TritonTokenizerError: Tokenizer error: in ensemble 'ensemble', Failed to process the request(s) for model instance 'preprocessing_0_126', message: ValueError: To standardize tokenizer behavior, we prepend '!' to the string representation of each stop sequence. We then strip the corresponding first token from the stop sequence IDs. However, the first token of the stop sequence IDs was not '{arbitrary_start_sequence_id}', which suggests there is a problem with the tokenizer that you are using.
At:
  /src/triton_model_repo/preprocessing/1/model.py(287): _to_word_list_format
  /src/triton_model_repo/preprocessing/1/model.py(182): execute
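For context, the check that fails appears to be the word-list trick the error message itself describes: encode '!' plus the stop string, then strip the leading '!' token. The sketch below is a hypothetical reconstruction of that logic, not the actual preprocessing/1/model.py source, and the Hugging Face checkpoint name is an assumption; with a BPE tokenizer, '!' can merge with the next character of the stop sequence, which is exactly the case that raises the ValueError.

```python
from transformers import AutoTokenizer

# Hypothetical reconstruction of the trick described in the error message,
# not the actual /src/triton_model_repo/preprocessing/1/model.py source.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B")  # assumed checkpoint

stop_sequence = '"}'
bang_ids = tokenizer.encode("!", add_special_tokens=False)
combined_ids = tokenizer.encode("!" + stop_sequence, add_special_tokens=False)

# The trick assumes the first token of '!' + stop_sequence is exactly the '!'
# token. If the tokenizer merges '!' with the following character (e.g. '!"'
# becoming a single token), that assumption breaks and the preprocessor
# raises the ValueError shown above.
if combined_ids[0] != bang_ids[0]:
    raise ValueError("first token of the stop sequence IDs was not the '!' token")
stop_ids = combined_ids[1:]
print(stop_ids)
```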
Expected Behavior
All stop sequence inputs should be handled and applied, such that generation stops when those sequences are encountered.
Reproduce
This request against llama-3-70b triggers the error reliably:
https://replicate.com/p/1w0ht542kdrgj0cg7c2vpkr4a0
This request ran against:
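For illustration, a request of roughly the following shape reproduces the failure. The input field names (prompt, stop_sequences) are assumptions based on Replicate's published Llama 3 schema, and the prompt is a placeholder; the linked prediction above is the authoritative reproduction.

```python
import replicate

# Sketch of a reproducing request; field names and prompt are assumptions,
# not copied from the linked prediction.
output = replicate.run(
    "meta/meta-llama-3-70b",
    input={
        "prompt": "Return a JSON object describing a cat.",
        "stop_sequences": '"}',  # the stop sequence from the report
    },
)
print("".join(output))
```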