Hello @rivalscope! I'm here to help you with any bugs, questions, or contributions. Let's work together to resolve the issue.
There is a known issue related to the "Could not connect to Ollama API" error when using the Ollama Embeddings component in Langflow v1.1. The suggested solution is to ensure that the `base_url` is correctly set and that the Ollama API is running and accessible at that URL. The default `base_url` is `http://localhost:11434`, so you should verify that the Ollama API is running on your local machine at port 11434. If it's running on a different host or port, update the `base_url` in the `OllamaEmbeddingsComponent` configuration accordingly [1].
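To rule out networking problems first, here is a minimal sketch (assuming the `requests` package) that checks whether the Ollama API is reachable; `GET /api/tags` is Ollama's endpoint for listing locally available models:

```python
# Minimal reachability check for the Ollama API (assumes `requests`).
import requests

base_url = "http://localhost:11434"  # adjust if Ollama runs elsewhere

try:
    resp = requests.get(f"{base_url}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print(f"Ollama reachable at {base_url}; local models: {models}")
except requests.RequestException as exc:
    print(f"Could not connect to Ollama API at {base_url}: {exc}")
```

Note that when Langflow runs inside Docker, as in this report, `localhost` resolves to the container itself rather than the host, so the `base_url` may need to point at `http://host.docker.internal:11434` or the host machine's IP instead.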
Additionally, a mentioned workaround is to use the LMStudio Embeddings component and route it to `/v1`, which appears to work fine with Ollama and even loads the available models [2].
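For context, this workaround likely functions because Ollama also serves an OpenAI-compatible API under `/v1`. A rough sketch of the equivalent call outside Langflow (the `openai` package and the `nomic-embed-text` model name are assumptions, not part of the original report):

```python
# Sketch of the /v1 workaround: talk to Ollama through its
# OpenAI-compatible endpoint (assumes `pip install openai` and
# that an embedding model such as `nomic-embed-text` has been pulled).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible route
    api_key="ollama",  # required by the client, ignored by Ollama
)

response = client.embeddings.create(
    model="nomic-embed-text",
    input="Hello from Langflow",
)
print(len(response.data[0].embedding))  # dimensionality of the returned vector
```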
I am experiencing this as well. Someone else posted a similar issue with a potential fix: removing the temperature setting from the component's "code" function.
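For anyone wanting to try that fix, here is a hypothetical before/after sketch, assuming the component wraps `OllamaEmbeddings` from `langchain_ollama` (the actual Langflow source may differ, and the `build_embeddings` helper is illustrative only):

```python
# Hypothetical sketch of the component's build step; names are illustrative.
from langchain_ollama import OllamaEmbeddings

def build_embeddings(base_url: str, model: str) -> OllamaEmbeddings:
    # Before (reported to fail): a `temperature` setting was forwarded
    # along with the other kwargs.
    # After (reported fix): pass only the parameters the embeddings
    # endpoint actually needs.
    return OllamaEmbeddings(base_url=base_url, model=model)
```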
Bug Description
The component does not connect to the Ollama API at all; it fails even with the URL hardcoded.
Workaround: use the LMStudio Embeddings component and route it to `/v1` (it works just fine with Ollama and even loads the available models).
For reference, please see the attached image.
Reproduction
Try to connect the Ollama Embeddings component to a running Ollama server.
Expected behavior
1. Connect to the Ollama server
2. Execute the embedding function
3. Respond with the embedding vectors
Who can help?
No response
Operating System
Ubuntu 22.04, presumably (Langflow Docker image)
Langflow Version
1.1
Python Version
None
Screenshot
No response
Flow File
No response