
Clarifai: Fixed model name error and streaming #4170

Open
wants to merge 3 commits into base: main
Conversation

mogith-pn
Contributor

Fixed model name error and streaming

Type

πŸ› Bug Fix
βœ… Test

Changes

  • Removed the conversion of Clarifai model names to lowercase; refer to the Clarifai community page for the exact model names.
  • Modified the stream wrapper function.
  • Added tests for streaming.

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes
[Screenshot: new tests passing locally, 2024-06-13 at 4:54 PM]


vercel bot commented Jun 13, 2024

The latest updates on your projects. Learn more about Vercel for Git.

Name    | Status             | Preview       | Comments        | Updated (UTC)
litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Jun 18, 2024 6:36pm

@mogith-pn
Contributor Author

@ishaan-jaff,
Thanks for #4158. Not all model names in Clarifai follow the same standard, due to some naming limitations, so converting names to lowercase throws an error for some models.
Also, since we are using a custom iterator for streaming, I have modified the changes accordingly and, as per Krish's comment on #4102, added a test case for streaming as well. Kindly take a look.
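For illustration (not code from this PR), a minimal sketch of the failure mode; the model ID below is hypothetical, but per the above, real Clarifai IDs can be mixed case and are matched case-sensitively:

```python
# Hypothetical mixed-case Clarifai model ID (illustrative only).
model = "mistralai/completion/models/Mixtral-8x7B"

# Old behavior: normalize the whole ID to lowercase.
requested = model.lower()  # "mistralai/completion/models/mixtral-8x7b"

# The lowercased ID no longer refers to the hosted model,
# so the request fails with a model-not-found style error.
assert requested != model
```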

@krrishdholakia
Contributor

hey @mogith-pn, this doesn't solve the problem of async streaming not working

@mogith-pn
Contributor Author

> hey @mogith-pn, this doesn't solve the problem of async streaming not working

@krrishdholakia ,

  • Added async streaming completion and an associated test in test_streaming.py.
    Please review it and do the needful, since the model name issue will be a huge blocker for users using liteLLM with Clarifai.
    TIA :)

```python
        )
        ## RESPONSE OBJECT
        try:
            completion_response = response.iter_lines()
```
Contributor

this should be aiter_lines to return an async iterable
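For reference (not code from this PR), a minimal sketch of the difference with httpx; the endpoint and payload are hypothetical. `iter_lines()` returns a plain synchronous iterator, while `aiter_lines()` returns an async iterable that can be consumed with `async for`:

```python
import asyncio
import httpx

async def stream_lines(url: str):
    # response.aiter_lines() yields an async iterator of decoded lines,
    # suitable for `async for`; the sync response.iter_lines() would not
    # work inside an async code path.
    async with httpx.AsyncClient() as client:
        async with client.stream("POST", url, json={"inputs": []}) as response:
            async for line in response.aiter_lines():
                yield line

async def main():
    # Hypothetical endpoint, for illustration only.
    async for line in stream_lines("https://api.clarifai.com/v2/models/demo/outputs"):
        print(line)

asyncio.run(main())
```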

Contributor Author

My bad, changed it to `aiter` and ran the tests. It runs without any errors.

Contributor Author

@krrishdholakia
Could you please take a look at it?
TIA :)

```diff
@@ -11129,7 +11129,7 @@ def handle_clarifai_completion_chunk(self, chunk):
         completion_tokens = len(encoding.encode(text))
         return {
             "text": text,
-            "is_finished": True,
+            "is_finished": False,
```
Contributor

when is this ever finished? @mogith-pn

if you fake streaming isn't the first chunk also the last one?
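For context, with fake streaming the whole completion arrives in one response, so the single emitted chunk is both the first and the last; a minimal sketch (key names follow the diff above and the `finish_reason` key referenced in the traceback below):

```python
def fake_stream(full_text: str):
    # The entire response is already available, so exactly one chunk is
    # emitted. Being both the first and the last chunk, it is marked
    # finished and carries a terminal finish_reason.
    yield {
        "text": full_text,
        "is_finished": True,
        "finish_reason": "stop",
    }
```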

Contributor Author

> when is this ever finished? @mogith-pn
>
> if you fake streaming isn't the first chunk also the last one?

This is a bit of a tricky situation. Previously, when I added the integration, this was working fine (with version 1.37.5).
But now it throws the error below when is_finished is set to True.

Attaching the logs here.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

Logging Details: logger_fn - None | callable(logger_fn) - False
Logging Details LiteLLM-Failure Call: []

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

Logging Details: logger_fn - None | callable(logger_fn) - False
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/workspaces/litellm-fork/litellm/utils.py in ?(self, chunk)
  11817             traceback_exception = traceback.format_exc()
  11818             e.message = str(e)
> 11819             raise exception_type(
  11820                 model=self.model,

KeyError: 'finish_reason'

During handling of the above exception, another exception occurred:

APIConnectionError                        Traceback (most recent call last)
/workspaces/litellm-fork/litellm/utils.py in ?(self)
  11942                 raise e
  11943             else:
> 11944                 raise exception_type(
  11945                     model=self.model,

/workspaces/litellm-fork/litellm/utils.py in ?(self, chunk)
  11817             traceback_exception = traceback.format_exc()
  11818             e.message = str(e)
> 11819             raise exception_type(
  11820                 model=self.model,
...
-> 9990             raise e
   9991         else:
   9992             raise original_exception

APIConnectionError: 'finish_reason'

Any idea why this happened? @krrishdholakia
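For reference, the KeyError suggests the streaming wrapper reads chunk["finish_reason"] unconditionally, so every chunk dict needs that key; a hedged sketch of the handler (the parsing path for `text` is assumed, not taken from this PR):

```python
def handle_clarifai_completion_chunk(self, chunk):
    # Assumed response shape; the actual parsing in the PR may differ.
    text = chunk["outputs"][0]["data"]["text"]["raw"]
    is_last = True  # with fake streaming, the only chunk is also the last
    return {
        "text": text,
        "is_finished": is_last,
        # The wrapper indexes this key (see the KeyError above), so it
        # must be present on every chunk, finished or not.
        "finish_reason": "stop" if is_last else "",
    }
```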
