
Add support for Ollama model parameters mirostat-eta and mirostat-tau #3345

Open
2 tasks done
sealad886 opened this issue Dec 12, 2024 · 0 comments · May be fixed by #3348
Assignees
Labels
area:configuration Relates to configuration options · kind:enhancement Indicates a new feature request, improvement, or extension · "needs-triage"

Comments

@sealad886 (Contributor)

Validations

  • I believe this is a way to improve. I'll try to join the Continue Discord for questions
  • I'm not able to find an open issue that requests the same enhancement

Problem

The config.json schema supports the mirostat completion option for Ollama models, but not the companion parameters that are paired with it. Ollama has since caught up to llama.cpp and now supports more of these configuration parameters, both in its Modelfile parsing and in its API.

Ollama API options
Ollama Modelfile params

{
    "model": "AUTODETECT",
    "title": "Ollama",
    "provider": "ollama",
    "contextLength": 32768,
    "capabilities": {
      "uploadImage": true
    },
    "completionOptions": {
      "temperature": 0.05,
      "stream": true,
      "mirostat": 2,
      // "mirostatEta": 0.1,   <-- influences how quickly the algorithm responds to feedback from the generated text
      // "mirostatTau": 5.0,   <-- controls the balance between coherence and diversity of the output
      "numThreads": 0,
      "topP": 0.95
    },
    "systemMessage": "You are a top-notch software engineer, and you are helping to develop and maintain complex, important software."
}

Additionally, it looks like someone already added support in the Ollama interface:

interface ModelFileParams {
  mirostat?: number;
  mirostat_eta?: number;
  mirostat_tau?: number;
  num_ctx?: number;
  repeat_last_n?: number;
  repeat_penalty?: number;
  temperature?: number;
  seed?: number;
  stop?: string | string[];
  tfs_z?: number;
  num_predict?: number;
  top_k?: number;
  top_p?: number;
  min_p?: number;
  // deprecated?
  num_thread?: number;
  use_mmap?: boolean;
  num_gqa?: number;
  num_gpu?: number;
}

Solution

It really should just be a matter of updating the schema.ts definition:

completionOptions: z
  .object({
    temperature: z.number().optional(),
    topP: z.number().optional(),
    topK: z.number().optional(),
    minP: z.number().optional(),
    presencePenalty: z.number().optional(),
    frequencyPenalty: z.number().optional(),
    mirostat: z.number().optional(),
    stop: z.array(z.string()).optional(),
    maxTokens: z.number().optional(),
    numThreads: z.number().optional(),
    useMmap: z.boolean().optional(),
    keepAlive: z.number().optional(),
    raw: z.boolean().optional(),
    stream: z.boolean().optional(),
  })
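A minimal sketch of that change, assuming the new fields follow the existing camelCase convention and validate as optional numbers like mirostat (a sketch of the proposed addition, not the final implementation):

```typescript
// Sketch only: the two new fields, with camelCase names assumed to
// match the existing completionOptions convention in schema.ts.
import { z } from "zod";

const completionOptions = z.object({
  // ...existing fields elided...
  mirostat: z.number().optional(),
  mirostatEta: z.number().optional(), // how quickly mirostat reacts to feedback
  mirostatTau: z.number().optional(), // balance between coherence and diversity
});
```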

@dosubot dosubot bot added the area:configuration (Relates to configuration options) and kind:enhancement (Indicates a new feature request, improvement, or extension) labels Dec 12, 2024