TL;DR: Search assistant is not working properly with DISABLE_LITELLM_STREAMING config enabled.
I'm trying to use Onyx with the DISABLE_LITELLM_STREAMING config enabled, since my custom model server doesn't support streaming yet. It works normally with the General assistant, but the Search assistant returns empty messages.
I've tested this behavior with both a custom model server and a direct OpenAI connection config; both show the same empty result.
If I disable DISABLE_LITELLM_STREAMING, the Search assistant works as expected.
Steps to reproduce:
Enable DISABLE_LITELLM_STREAMING via the .env file when deploying Onyx with docker-compose
Configure an LLM with an OpenAI model
Try to use the Search assistant
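For reference, this is roughly how I set the flag (a minimal sketch; the variable name comes from the issue, the surrounding .env contents are assumed and will differ per deployment):

```shell
# .env fragment used with docker-compose (hypothetical example)
# Disables streaming responses from LiteLLM, needed because my
# custom model server does not support streaming yet.
DISABLE_LITELLM_STREAMING=true
```

After updating .env, recreate the containers (e.g. `docker compose up -d`) so the new environment variable is picked up.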
marcelogdeandrade changed the title from "Search assistant returning empty message with DISABLE_LITELLM_STREAMING" to "Search assistant returning empty message with DISABLE_LITELLM_STREAMING enabled" on Dec 17, 2024.