
Usage of local LLM models #20

Open

renatokuipers opened this issue Oct 26, 2023 · 1 comment

renatokuipers commented Oct 26, 2023

I don't know where else to put this, so I'm asking here.

I was wondering if it would be possible to use local LLM models (like Mistral or Llama 2) to write the functions through cataclysm.
While GPT-3 and GPT-4 are well suited for this, I can imagine that local LLM models are a bit more error-prone.

Is this something that is on cataclysm's roadmap, or is it something that can't be done?
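For context on what such support would involve: local servers such as Ollama and llama.cpp's server expose an OpenAI-compatible `/v1/chat/completions` endpoint, so if cataclysm's OpenAI calls could be redirected there, the request shape stays the same. A minimal sketch of that request body (the redirect itself, the endpoint, and the model names are assumptions about a possible setup, not cataclysm's confirmed API):

```python
import json

def build_local_completion_request(prompt: str, model: str = "mistral") -> str:
    """Build an OpenAI-style chat-completion request body as JSON.

    Sketch only: cataclysm's own configuration hooks are unknown here. This
    just shows the payload a local OpenAI-compatible server (e.g. Ollama at
    http://localhost:11434/v1/chat/completions) would accept in place of the
    hosted OpenAI API.
    """
    payload = {
        "model": model,  # a locally pulled model, e.g. "mistral" or "llama2"
        "messages": [
            {"role": "system", "content": "You write Python functions."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,  # lower temperature for more deterministic code
    }
    return json.dumps(payload)

body = build_local_completion_request("Write a function that reverses a string.")
```

Whether the generated functions are reliable enough would then come down to the local model's code quality, as noted above.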

renatokuipers (Author) commented:

What would be even cooler would be the ability to combine Cataclysm and AutoGen.
