I'm trying to run a project locally against a vLLM OpenAI-compatible instance, but I've hit a stumbling block. Whenever I start the project, I get the following error: "TypeError: __init__() got an unexpected keyword argument 'azure_endpoint'".
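This TypeError means the installed client's `__init__` simply doesn't accept an `azure_endpoint` keyword (a version mismatch between llmx and the `openai` package). A minimal stand-in class (hypothetical, not llmx's actual code) reproduces the same failure mode:

```python
class OldClient:
    """Stand-in for a client whose __init__ predates the azure_endpoint kwarg."""

    def __init__(self, api_key=None):
        self.api_key = api_key


try:
    # Passing a keyword the constructor doesn't know about raises TypeError
    OldClient(api_key="EMPTY", azure_endpoint="https://example.invalid")
except TypeError as exc:
    print(exc)
```

Any keyword the installed version's signature lacks produces the same "unexpected keyword argument" message, which is why pinning or patching the caller fixes it.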
I had the same issue trying to use Hugging Face Pro's OpenAI-compatible API with Llama 3.1.
I fixed it by:

1. Open site-packages/llmx/generators/text/openai_textgen.py and, in OpenAITextGenerator.__init__, add base_url as a parameter and pass it through to self.client_args. If needed, also update the api_key default here.
2. Under generate -> oai_config, comment out "top_p": config.top_p.
Save, and the OpenAI provider should now work with other base URLs (HF Pro, Ollama, etc.).
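The patch above can be sketched as a minimal stand-in class. This is not llmx's actual source; the names OpenAITextGenerator, client_args, and base_url come from the steps above, and everything else is an assumption about the surrounding code:

```python
import os


class OpenAITextGenerator:
    """Minimal sketch of the patched generator, not llmx's real implementation."""

    def __init__(self, api_key=None, base_url=None, **kwargs):
        # Fall back to the environment variable if no key is passed explicitly
        api_key = api_key or os.environ.get("OPENAI_API_KEY")
        self.client_args = {"api_key": api_key}
        if base_url:
            # Forwarding base_url is what lets non-OpenAI endpoints
            # (HF Pro, Ollama, a local vLLM server) be used
            self.client_args["base_url"] = base_url

    def build_oai_config(self, config):
        # Mirrors the oai_config dict with "top_p" commented out, since some
        # OpenAI-compatible backends reject that parameter
        return {
            "model": config.get("model"),
            "temperature": config.get("temperature"),
            # "top_p": config.get("top_p"),  # commented out per the fix
        }
```

With this change, a base_url such as http://192.168.1.254:8000/v1 ends up in client_args and is passed on to the OpenAI client, while top_p is never sent to the backend.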
The code that triggers the error:
lida = Manager(text_gen=llm(provider="openai", api_base="http://192.168.1.254:8000/v1", api_key="EMPTY", models=model_details))