Customized Endpoints For Proxies #404
jamsandwiches started this conversation in Ideas
Replies: 1 comment
-
hi @jamsandwiches, have you checked the Advanced settings? There are overrides there for both the chat and embedding models; could you try them and let me know if they work? Some users have reported CORS issues, but I need more data points.
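On the CORS point above: a browser- or Electron-based client can only call a cross-origin proxy if the proxy returns the right CORS response headers. A minimal sketch of what a proxy would typically need to send (header values are illustrative, not a statement about any specific plugin or proxy):

```python
# Hypothetical minimal set of CORS response headers a proxy must emit
# so a browser/Electron client can call it from a different origin.
cors_headers = {
    # Allow any origin; in practice you would restrict this to the app's origin.
    "Access-Control-Allow-Origin": "*",
    # Chat/embedding requests are POSTs; OPTIONS covers the preflight request.
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    # Authorization carries the API key; Content-Type is application/json.
    "Access-Control-Allow-Headers": "Content-Type, Authorization",
}

for name, value in cors_headers.items():
    print(f"{name}: {value}")
```

If a request fails with a CORS error in the developer console, checking whether the proxy's preflight (OPTIONS) response includes these headers is usually the first diagnostic step.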
-
Hi,
I'm currently looking at using your plug-in for my personal work vault. We host our LLMs behind a proxy (hosted in GCP, but connecting to multiple endpoints and LLM hosts such as Azure, Vertex, and Anthropic). It would be fantastic to be able to add our own endpoints, which would let us use proxies or self-hosted models. This would also require making sure the model name being called is passed through correctly, so the request reaches the right backend. I'm curious whether this could be added as free-text settings that are passed through as-is, or whether there is a better solution.
Thanks!