Replies: 5 comments
-
Not yet. Can you share docs, and maybe someone can try to support this?
-
hey @kingennio - litellm maintainer here - how do you want to use batch completions with instructor?
-
Hi there, thank you for your interest.
-
Hi there. Currently, calling batch_completion through instructor results in an error. I'm assuming this is because handling batch completions (which return a list of responses rather than a single response) isn't supported by instructor.
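The exact snippet didn't survive, but a hedged guess at the kind of call that fails is something like this (the `UserInfo` schema and model name are illustrative, not from the original):

```python
# Hedged guess at the failing pattern: handing litellm.batch_completion to
# instructor.from_litellm. batch_completion returns a list of ModelResponse
# objects, while instructor expects a single completion it can validate
# against response_model, so the parsing step fails.
import instructor
from litellm import batch_completion
from pydantic import BaseModel

class UserInfo(BaseModel):  # illustrative schema
    name: str
    age: int

client = instructor.from_litellm(batch_completion)

# Fails: instructor cannot parse the list that batch_completion returns.
users = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserInfo,
    messages=[
        [{"role": "user", "content": "Jason is 25 years old."}],
        [{"role": "user", "content": "Sarah just turned thirty."}],
    ],
)
```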
-
Would it work if you call liteLLM directly first and then use instructor for the parsing in a second call (perhaps even with a different model)? Idea from here:
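A minimal sketch of that two-step idea, assuming litellm's documented `batch_completion` for the fan-out and instructor's `from_litellm` wrapper for the parsing pass; the model names and the `UserInfo` schema are illustrative:

```python
import instructor
from litellm import batch_completion, completion
from pydantic import BaseModel

class UserInfo(BaseModel):  # illustrative schema
    name: str
    age: int

# Step 1: fan out the raw generations with litellm's batch_completion,
# which takes one message list per prompt and returns a list of responses.
raw_responses = batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "Jason is 25 years old."}],
        [{"role": "user", "content": "Sarah just turned thirty."}],
    ],
)

# Step 2: parse each raw answer into the schema with instructor, one
# structured-extraction call per response (a cheaper model would do here).
client = instructor.from_litellm(completion)
parsed = [
    client.chat.completions.create(
        model="gpt-3.5-turbo",
        response_model=UserInfo,
        messages=[
            {"role": "user", "content": r.choices[0].message.content},
        ],
    )
    for r in raw_responses
]
```

The extra pass costs one additional call per batched response, but the batch step keeps its concurrency speedup and the extraction model can be swapped independently.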
-
Hi there. I was trying to use instructor with liteLLM. I can make it work for standard chat completions, as shown in the example, but I was trying to exploit the batching supported by liteLLM through their batch_completion function. However, I'm unable to make it work with instructor. Am I right that this feature cannot be used with instructor? Thanks
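For reference, the standard single-completion pattern that does work is instructor's documented `from_litellm` wrapper; the schema and model name below are illustrative:

```python
import instructor
from litellm import completion
from pydantic import BaseModel

class UserInfo(BaseModel):  # illustrative schema
    name: str
    age: int

# instructor wraps litellm's completion function directly.
client = instructor.from_litellm(completion)

user = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "Jason is 25 years old."}],
)
print(user)  # UserInfo(name='Jason', age=25)
```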