Context size #728
Replies: 2 comments
-
The context window size is in tokens, not in characters. Which model are you using? You also need to pay attention to the chunk size and overlap. You can do that: on the line where you load the db, return it from the function and then call `db.similarity_search(query)`. This will just return the relevant chunks without sending them through the LLM. Hope this helps.
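For anyone who finds this later, here is a minimal sketch of that suggestion, assuming a persisted Chroma store with Instructor embeddings as localGPT uses (the persist directory, embedding model name, query string, and `k` below are placeholders, not the project's actual constants):

```python
from langchain.embeddings import HuggingFaceInstructEmbeddings
from langchain.vectorstores import Chroma

def load_db(persist_directory="DB"):
    # Placeholder directory and model name; use whatever your ingestion run used.
    embeddings = HuggingFaceInstructEmbeddings(model_name="hkunlp/instructor-large")
    # Re-open the persisted vector store and return it,
    # instead of wiring it straight into an LLM chain.
    return Chroma(persist_directory=persist_directory, embedding_function=embeddings)

db = load_db()
# Returns the top-k matching chunks directly, with no LLM call.
docs = db.similarity_search("your query here", k=4)
for doc in docs:
    print(doc.page_content)
```

Note that chunk size and overlap are fixed when the documents are ingested (e.g. LangChain's `RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)`), so changing them means re-running ingestion.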
-
Hi @PromtEngineer, thanks so much for your response!
-
Hi, I've noticed that CONTEXT_WINDOW_SIZE = 4096 appears to be measured in characters rather than tokens. However, if I change it to 8000 I still only get the same ~4000-character context going to the LLM.
Does anyone know how to change this?
Also, is there a way to output the result from the LangChain db search rather than it going straight to the LLM? I would like to output the context and query to Copilot.
Many thanks
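
A quick way to check whether the ~4000 limit you are hitting is characters or tokens is to run the retrieved context through a tokenizer. A minimal sketch with Hugging Face `transformers` (the tokenizer id is a placeholder; swap in the one matching your LLM, and `docs` comes from the `similarity_search` sketch above):

```python
from transformers import AutoTokenizer

# Placeholder tokenizer; use the one matching the model you actually run.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# `docs` as returned by db.similarity_search(...) in the sketch above.
context = "\n\n".join(doc.page_content for doc in docs)
n_tokens = len(tokenizer.encode(context))
print(f"{len(context)} characters -> {n_tokens} tokens")

# English text averages roughly 4 characters per token, so a 4096-token
# window holds far more than 4096 characters. If the context is being cut
# near ~4000 characters, the bound is likely the number and size of the
# retrieved chunks (k x chunk_size), not CONTEXT_WINDOW_SIZE itself.
```

Printing `context` together with the query also gives you the exact text to paste into Copilot, rather than sending it through the local LLM.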