Hi--we'd like to use Chalice for an OpenAI streaming project (streaming results back from the OpenAI SDK with its streaming flag enabled, the way ChatGPT streams responses as the AI generates them). Lambda response streaming appears to be supported in the Node.js runtime, but not yet in the Python runtimes.
There does seem to be a Python API in boto3 to turn this on: the Lambda client's invoke_with_response_stream method. Is there some limitation that would stop us from using this in Chalice?
thanks!