How to set context length #597
HeyZeus1232 asked this question in Q&A

Hi, how do we set the context length for local models?

There is currently an issue where local models do not follow any of the patterns and output only summaries. I figured out it was a context-length issue: the same thing happens even with a model that has a large context size. I tested this in the Web UI as well, by copying and pasting a pattern into it and running it, and hit the same issue. Then I manually set the context length higher in the Web UI settings, and now it works perfectly.

So we need to be able to set the context length manually for local models. Is there a way to do this?

Replies: 2 comments
- If you are using local models hosted by Ollama, you should modify the Modelfile by increasing num_ctx as far as the model allows (`ollama show [model name]` in v1.45 will show how large a context the model was fine-tuned for). After you create the new model, it will show up in `fabric --listmodels`; at that point just call it with the `--model` flag (see the sketch below the replies).
- As soon as #1088 lands you can do it.
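For anyone landing here, below is a minimal sketch of the Ollama steps from the first reply. The base model name (llama3), the custom model name (llama3-8k), the num_ctx value of 8192, and the pattern name are all illustrative; substitute whatever your model and hardware support.

```sh
# Check the context length the base model was fine-tuned for
ollama show llama3

# Write a Modelfile that raises num_ctx (model and value are illustrative)
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER num_ctx 8192
EOF

# Build the custom model; it should then appear in fabric's model list
ollama create llama3-8k -f Modelfile
fabric --listmodels

# Call it from fabric with the --model flag
cat input.txt | fabric --model llama3-8k --pattern summarize
```

Keep in mind that a larger num_ctx increases memory use, so only raise it as far as the model and your hardware allow.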