Externalize the LLM Prompts. #553
DanHUMassMed started this conversation in Ideas.
I would like to suggest a new feature: externalize the LLM prompts.
A number of User Stories I see positively impacted are:
Some ideation on implementation details:
Prompts could live in an external file (e.g., `prompts.json`), and no config variable is needed.
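To make the idea a bit more concrete, here is a minimal sketch in Python of what I have in mind (purely illustrative; `DEFAULT_PROMPTS`, `load_prompts`, and the prompt keys are placeholder names, not anything from the current codebase). The library keeps its built-in prompts as defaults and, if a `prompts.json` is found at a default location, merges the user-supplied prompts over them:

```python
# Hypothetical sketch: externalized prompts merged over built-in defaults.
import json
from pathlib import Path

# Built-in defaults, used when prompts.json is absent or omits a key.
# The keys and templates below are illustrative only.
DEFAULT_PROMPTS = {
    "summarize": "Summarize the following text in three sentences:\n{text}",
    "extract_keywords": "List the key terms in the following text:\n{text}",
}


def load_prompts(path: str = "prompts.json") -> dict:
    """Return built-in prompts, overridden by any found in prompts.json.

    Because the file is looked up at a well-known default path,
    a missing file simply means the defaults are used.
    """
    prompts = dict(DEFAULT_PROMPTS)
    prompt_file = Path(path)
    if prompt_file.exists():
        with prompt_file.open("r", encoding="utf-8") as fh:
            prompts.update(json.load(fh))
    return prompts


if __name__ == "__main__":
    prompts = load_prompts()
    print(prompts["summarize"].format(text="..."))
```

A user who wants to customize a prompt would only need to drop a `prompts.json` next to their project with the keys they want to override; everything else keeps working unchanged.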
Thanks,
Dan