
Ollama Embeddings component is faulty in v1.1: Error building Component Ollama Embeddings: Could not connect to Ollama API. #4728

Open
rivalscope opened this issue Nov 20, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@rivalscope

Bug Description

  • It does not connect to the API at all (it does not work even with hardcoded values).

  • Workaround: use the LMStudio Embeddings component and route it to /v1 (it works just fine with Ollama and even loads the available models).

For reference, please see the attached image.

Screenshot 2024-11-19 at 23 57 56

Reproduction

Try:

  • Fill in the model name and base URL as usual.
  • Hardcode the model name and base URL.

Expected behavior

1. Connect to the Ollama server
2. Execute the embedding function
3. Respond with the embedding vectors
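The expected flow can be sketched against Ollama's native REST API (a minimal sketch using only the standard library; the `/api/embeddings` route and its `model`/`prompt` fields are assumed from Ollama's REST API, not taken from Langflow's component code, and `nomic-embed-text` is just an example model name):

```python
import json
import urllib.request

def build_embed_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's native embeddings endpoint (assumed route)."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (actually sending it requires a running Ollama server):
req = build_embed_request("http://localhost:11434", "nomic-embed-text", "hello world")
# vectors = json.loads(urllib.request.urlopen(req).read())
```

If step 1 fails, the component should surface a connection error rather than a generic build error.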

Who can help?

No response

Operating System

Ubuntu 22.04 (Langflow Docker image)

Langflow Version

1.1

Python Version

None

Screenshot

No response

Flow File

No response

@rivalscope rivalscope added the bug Something isn't working label Nov 20, 2024

dosubot bot commented Nov 20, 2024

Hello @rivalscope! I'm here to help you with any bugs, questions, or contributions. Let's work together to resolve the issue.

There is a known issue related to the "Could not connect to Ollama API" error when using the Ollama Embeddings component in Langflow v1.1. The suggested solution is to ensure that the base_url is correctly set and that the Ollama API is running and accessible at that URL. The default base_url is http://localhost:11434, so you should verify that the Ollama API is running on your local machine at port 11434. If it's running on a different host or port, update the base_url in the OllamaEmbeddingsComponent configuration accordingly [1].
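The base_url check above can be scripted as a quick reachability probe (a sketch, assuming Ollama's `GET /api/tags` endpoint, which lists locally available models; any HTTP 200 from it means the server is up at that URL):

```python
import urllib.error
import urllib.request

def ollama_reachable(base_url: str = "http://localhost:11434", timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url (GET /api/tags)."""
    try:
        with urllib.request.urlopen(f"{base_url.rstrip('/')}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Running this from inside the Langflow Docker container matters: `localhost` there refers to the container itself, so a host-side Ollama typically needs `http://host.docker.internal:11434` or the host's LAN IP as the base_url.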

Additionally, a workaround mentioned is to use the LMStudio Embeddings component and route to /v1, which seems to work fine with Ollama and loads available models [2].
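The /v1 workaround relies on Ollama exposing OpenAI-compatible endpoints, which is what the LMStudio component speaks. A sketch of the request that path produces (the `/v1/embeddings` route and `model`/`input` payload shape are assumed from the OpenAI embeddings API convention):

```python
import json
import urllib.request

def build_openai_style_request(base_url: str, model: str, text: str) -> urllib.request.Request:
    """Build an OpenAI-compatible embeddings request, as the /v1 workaround does."""
    payload = json.dumps({"model": model, "input": text}).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_openai_style_request("http://localhost:11434", "nomic-embed-text", "hello")
```

That this path works while the native component fails suggests the connectivity itself is fine and the failure is inside the component's own client setup.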


@greenmotion

I am experiencing this as well. Someone else posted a similar issue with a potential fix: removing the temperature setting from the "code" function on the component.
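The suggested fix amounts to stripping settings the embeddings client does not accept before constructing it. A minimal sketch (the helper name and the exact set of rejected fields are hypothetical; the source only names `temperature`):

```python
def clean_embedding_kwargs(config: dict) -> dict:
    """Drop fields that embedding backends reject (hypothetical helper mirroring
    the suggested fix of removing `temperature` from the component code)."""
    unsupported = {"temperature"}  # only field named in the reported fix
    return {k: v for k, v in config.items() if k not in unsupported}

kwargs = clean_embedding_kwargs(
    {"model": "nomic-embed-text", "base_url": "http://localhost:11434", "temperature": 0.7}
)
# kwargs can now be passed to the embeddings client without the rejected field
```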
