In keys.ts, do I need to fill in all three keys? #35
Comments
The request fails because you don't have an Ollama model locally, so there is no response. You can install another model, e.g. ollama3; it doesn't have to be the moondream:1.8b-v2-fp16 model.
Right, I don't have a model locally; I got that address from someone else. I assumed that would work. I'll go install a model on my own machine now.
OP, did following the steps work for you? I ran into the same problem. Once you're on that screen, which task commands can be used? They aren't arbitrary, are they?
Interpreting images requires a multimodal model, i.e. one with vision capability, such as llava-llama3.
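As a setup sketch of the advice above (assuming a standard local Ollama install; the model names are examples, not project requirements):

```shell
# Pull a vision-capable model for image tasks (example from the thread).
ollama pull llava-llama3

# Verify the local server answers on the endpoint used in keys.ts.
curl http://localhost:11434/api/tags
```

If the curl call fails, the Ollama server is not running locally, which matches the "no response" symptom described in this issue.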
// Provider endpoints/keys; leave the providers you don't use empty.
export const keys = {
  groq: '',                                   // Groq API key
  ollama: 'http://localhost:11434/api/chat',  // local Ollama chat endpoint
  openai: ''                                  // OpenAI API key
};
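A minimal sketch of how the `ollama` entry might be consumed, assuming the request shape of Ollama's `/api/chat` endpoint; `buildOllamaChatBody`, `ChatMessage`, and the model name are illustrative and not taken from this project's code:

```typescript
// Mirrors the keys.ts snippet above so this sketch is self-contained.
const keys = {
  groq: '',
  ollama: 'http://localhost:11434/api/chat',
  openai: ''
};

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Build the JSON body that Ollama's /api/chat endpoint expects.
function buildOllamaChatBody(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// A vision-capable model (e.g. llava-llama3) is needed for image tasks.
const body = buildOllamaChatBody('llava-llama3', [
  { role: 'user', content: 'Describe this image.' }
]);

// To actually send it (requires a running local Ollama server):
// fetch(keys.ollama, { method: 'POST', body: JSON.stringify(body) });
```

Because no Ollama server or model is present, a fetch to `keys.ollama` fails and the UI shows no response, which is the behavior reported in this issue.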