Just to clarify, since I didn't see it in the OP: when using the LocalAI provider with a base URL that includes localhost, the error shown in the logs is:
```
INFO 2024-01-12 08:07:41,631 coordinator Actor _inputs1 has no dependencies. Sending BEGIN message
ERROR 2024-01-12 08:07:41,633 __init__ Exception occurred while processing
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/llmstack/common/blocks/llm/__init__.py", line 42, in process
    return self._process(self.parse_validate_input(input), self.configuration)
  File "/usr/local/lib/python3.10/dist-packages/llmstack/common/blocks/llm/openai.py", line 136, in _process
    http_input = HttpAPIProcessorInput(
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for HttpAPIProcessorInput
url
  URL host invalid, top level domain required (type=value_error.url.host)
```
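For context, the failure happens inside Pydantic v1's URL validation (the `pydantic.error_wrappers` module in the traceback indicates v1), before any HTTP request is attempted. A minimal sketch of the behavior, assuming the failing field is typed as `HttpUrl`; `UrlInput` below is a hypothetical stand-in for LLMStack's `HttpAPIProcessorInput`:

```python
# Sketch assuming Pydantic v1. UrlInput is a hypothetical stand-in for
# HttpAPIProcessorInput; only the field type matters for reproducing the error.
from pydantic import BaseModel, HttpUrl, ValidationError

class UrlInput(BaseModel):
    url: HttpUrl  # Pydantic v1's HttpUrl sets tld_required=True

try:
    UrlInput(url="http://local-ai:8080")  # bare Docker service hostname, no TLD
except ValidationError as exc:
    print(exc)
    # 1 validation error for UrlInput
    # url
    #   URL host invalid, top level domain required (type=value_error.url.host)
```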
Describe the bug
Unable to add LocalAI to docker-compose and test it, since the provider's URL validation requires a TLD.
To Reproduce
Expected behavior
LLMStack should be able to communicate with a local LocalAI instance at a URL like
http://local-ai:8080
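One possible direction, offered as a sketch rather than the project's actual fix: in Pydantic v1, `AnyHttpUrl` uses the same validator as `HttpUrl` but with `tld_required=False`, so bare Docker Compose service hostnames pass validation while the http/https scheme is still enforced:

```python
# Sketch only: relaxing the field type from HttpUrl to AnyHttpUrl (Pydantic v1)
# drops the TLD requirement without dropping scheme validation.
from pydantic import AnyHttpUrl, BaseModel

class UrlInput(BaseModel):  # hypothetical stand-in for HttpAPIProcessorInput
    url: AnyHttpUrl

print(UrlInput(url="http://local-ai:8080").url)  # accepted
```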
Version
v0.0.7
Environment
DISTRIB_DESCRIPTION="Linux Mint 21.2 Victoria"
Docker version 24.0.5, build ced0996
Docker Compose version v2.20.3
Screenshots
Additional context