Replies: 4 comments
-
If we could do something like LangChain does, then it would just be a matter of installing the models separately and using the LLM:

```python
from langchain import PromptTemplate, LLMChain
from langchain.llms import GPT4All

template = """Question: {question}
```

langchain supports a large number of models today, link below. How easy will it be for us to use langchain as a project dependency and use it?
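To make the idea concrete, here is a minimal sketch of what a swappable, LangChain-style backend could look like. Everything here is hypothetical: `SimplePromptTemplate`, `SimpleLLMChain`, and `echo_llm` are stand-ins for langchain's `PromptTemplate`, `LLMChain`, and a real model wrapper like `GPT4All`, not gpt-engineer's actual API.

```python
from typing import Callable

# A backend is just "prompt in, completion out"; a langchain.llms.GPT4All
# instance could be dropped in wherever this callable is expected.
LLMBackend = Callable[[str], str]


class SimplePromptTemplate:
    """Minimal stand-in for langchain's PromptTemplate."""

    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs: str) -> str:
        return self.template.format(**kwargs)


class SimpleLLMChain:
    """Minimal stand-in for langchain's LLMChain: format the prompt, call the backend."""

    def __init__(self, prompt: SimplePromptTemplate, llm: LLMBackend):
        self.prompt = prompt
        self.llm = llm

    def run(self, **kwargs: str) -> str:
        return self.llm(self.prompt.format(**kwargs))


# A fake local model, standing in for GPT4All or any other backend.
def echo_llm(prompt: str) -> str:
    return f"[local-model] {prompt}"


chain = SimpleLLMChain(SimplePromptTemplate("Question: {question}"), echo_llm)
print(chain.run(question="What is 2 + 2?"))
# → [local-model] Question: What is 2 + 2?
```

The point is the seam: if gpt-engineer only ever calls a `prompt -> completion` callable, swapping OpenAI for a local model is a one-line change.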
-
This repo seems to make this much easier to implement. It provides an API interface as a backend for any local LLM.
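Many of these local-server projects expose an OpenAI-compatible `/v1/chat/completions` endpoint, so gpt-engineer would mostly just need to point at a different base URL. A sketch, assuming such a server: the host, port, and model name below are hypothetical; only the payload shape follows the OpenAI chat-completions wire format.

```python
import json
from urllib.request import Request

# Hypothetical local backend; any OpenAI-compatible server would do.
BASE_URL = "http://localhost:8000"


def build_chat_request(model: str, user_message: str) -> Request:
    """Build (but do not send) a chat-completions request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.1,
    }
    return Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("gpt4all-j", "Write a hello-world in Python.")
print(req.full_url)  # → http://localhost:8000/v1/chat/completions
```

Nothing is actually sent here; the sketch only shows that the integration surface is a URL plus a JSON payload, which is why these local API servers make the swap cheap.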
-
With the newer coding fine-tuned models coming out, this is going to become more important very soon.
-
I can start coding once we have finalized the approach.
-
As suggested, I'm creating a discussion thread so that we can discuss and come up with a solution to support any LLM in this wonderful gpt-engineer.