feat: add llm vendor Novita AI #25
Conversation
It looks good! Although, in its current state, clai won't be able to instantiate the Novita struct. There are two steps missing:
- Here the default Novita querier needs to be created. This is what defines the type of the generic `models.Querier` interface; it also ensures that the custom configuration is created for the vendor, and that the custom configurations for the specific models get properly overwritten.
- Here the vendor, model, and model version need to be parsed, also for the configuration system to work.

See how ollama is set up here for both steps. I'm thinking the prefix `novita:` will work well.
So, if this is done right, I'm thinking it should be possible to run:
clai -cm 'novita:gryphe/mythomax-l2-13b' query example123
Fixed and tested.
Looks good and works well!
I couldn't get tooling to work, but from looking at the API docs, I'm not sure the Novita API supports this yet?
Either way, it's an excellent start. Many thanks for the PR and contributions!
@@ -30,7 +30,7 @@ Nag on me to implement modellabs and I'll do it.
- **Mistral API Key:** Set the `MISTRAL_API_KEY` env var to your [Mistral API key](https://console.mistral.ai/). [Text models](https://docs.mistral.ai/getting-started/models/)
- **Ollama:** Start your ollama server (defaults to localhost:11434). Target using model format `ollama:<target>`, where `<target>` is optional (defaults to llama3). Reconfigure url with `clai s -> 1 -> <ollama-model-conf>`
- **Glow**(Optional): Install [Glow](https://github.com/charmbracelet/glow) for formatted markdown output when querying text responses.
- - **Novita AI:**(Optional) Set the `NOVITA_API_KEY` env var to your [Novita API key](https://novita.ai/settings?utm_source=github_clai&utm_medium=github_readme&utm_campaign=link#key-management). [Text models](https://novita.ai/model-api/product/llm-api?utm_source=github_clai&utm_medium=github_readme&utm_campaign=link).
- **Novita AI:**(Optional) Set the `NOVITA_API_KEY` env var to your [Novita API key](https://novita.ai/settings?utm_source=github_clai&utm_medium=github_readme&utm_campaign=link#key-management). [Text models](https://novita.ai/model-api/product/llm-api?utm_source=github_clai&utm_medium=github_readme&utm_campaign=link).
I'll move this up so that it's after the Ollama entry in the readme post-merge.
Novita AI does not support tool calling yet.
Add new LLM vendor Novita AI