Describe the bug
I ran Tabby with an HTTP OpenAI-compatible chat model served remotely by vLLM. In the chat, Tabby kept streaming a nonstop, never-ending answer, as shown in the screenshot below.
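For reproducibility, a chat-model configuration of this shape matches the setup described above; the endpoint and model name below are placeholders, not the actual values:

```toml
# Sketch of an OpenAI-compatible chat model entry in ~/.tabby/config.toml.
# api_endpoint and model_name are placeholders for the remote vLLM server.
[model.chat.http]
kind = "openai/chat"
model_name = "your-served-model"
api_endpoint = "http://your-vllm-host:8000/v1"
api_key = ""
```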
Information about your version
0.18.0
Additional context
Log from Tabby:
tabby-1 | The application panicked (crashed).
tabby-1 | Message: index out of bounds: the len is 0 but the index is 0
tabby-1 | Location: ee/tabby-webserver/src/service/answer.rs:173
tabby-1 |
tabby-1 | Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
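The panic message is the classic Rust failure mode of indexing the first element of an empty Vec or slice, presumably a list parsed from the chat response that came back empty. A minimal hypothetical sketch of the pattern (not the actual answer.rs code) and a panic-free alternative:

```rust
fn main() {
    // Stand-in for a list parsed from the OpenAI-compatible response;
    // with some backends it can legitimately come back empty.
    let choices: Vec<String> = Vec::new();

    // Panicking pattern: `choices[0]` on an empty Vec aborts with
    // "index out of bounds: the len is 0 but the index is 0".
    // let first = &choices[0];

    // Defensive pattern: `.first()` yields an Option instead of panicking.
    match choices.first() {
        Some(first) => println!("first choice: {first}"),
        None => eprintln!("model returned no choices; skipping"),
    }
}
```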