Add support for fully local deployment #10

Open
woop opened this issue May 30, 2023 · 2 comments

Comments

@woop
Collaborator

woop commented May 30, 2023

Step one is #16. The next step is to add support for a local Postgres/Supabase. Ideally we can have a Docker Compose setup that runs in a completely isolated manner, with OpenAI calls as the only managed dependency. Then we can introduce a local model (even if only for testing purposes).

We should consider LangChain as an abstraction over vector databases, since it already has support for multiple stores.
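
As a rough sketch of what that abstraction buys us, the snippet below swaps an in-process Chroma store for a pgvector-backed Postgres store behind LangChain's common VectorStore interface. Import paths reflect LangChain as of mid-2023, and the texts, collection name, and connection string are placeholders, not part of this project.

```python
# Sketch only: two interchangeable vector stores behind LangChain's
# VectorStore interface. Assumes OPENAI_API_KEY is set and, for the
# Postgres variant, a local Postgres with the pgvector extension
# (e.g. from the proposed docker compose setup).
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.vectorstores.pgvector import PGVector

texts = ["first memory", "second memory"]  # placeholder documents
embeddings = OpenAIEmbeddings()            # the only managed dependency

# Fully local, in-process store.
chroma_store = Chroma.from_texts(texts=texts, embedding=embeddings)

# Same data, backed by local Postgres/pgvector; connection details are placeholders.
pg_store = PGVector.from_texts(
    texts=texts,
    embedding=embeddings,
    collection_name="memories",
    connection_string="postgresql+psycopg2://postgres:postgres@localhost:5432/postgres",
)

# Either store satisfies the same interface, so callers don't need to
# know which backend is configured.
for store in (chroma_store, pg_store):
    print(store.similarity_search("memory", k=1))
```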

@seanpmorgan
Member

Commenting to say that the high-level goal here, enabling local development, is a high-priority task. We're going to move the server itself out so that the Python SDK and TypeScript SDK can be used locally with a vector DB (probably Chroma) and OpenAI calls / a local LLM.
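
To make the "SDK plus local vector DB" shape concrete, here is a minimal sketch of running Chroma fully in-process with its Python client. The collection and documents are made up for illustration; the project's own SDK surface isn't shown because it hasn't been defined yet.

```python
# Minimal sketch: Chroma running in-process, no server component.
# Chroma's default embedding function uses a small local model, so the
# only remote dependency left is whatever LLM the caller chooses.
import chromadb

client = chromadb.Client()  # in-memory, in-process instance
collection = client.get_or_create_collection(name="local_memories")

collection.add(
    ids=["1", "2"],
    documents=["first local memory", "second local memory"],
)

results = collection.query(query_texts=["local"], n_results=1)
print(results["documents"])
```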

@dvirginz

That would be a great capability. We are working in an isolated environment with only the Azure OpenAI ports open, and having the server run on-premises would help us a lot.
