LLMs support 🤖 #544
assafelovic announced in Announcements
-
Hi, I'm testing phi3 with Ollama locally. I have to say the results are pretty good on my old computer. Not as fast as GPT-4o, but to the point, except for the "Resource Report", which has some odd results. Needs further testing to see the pattern.
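For anyone who wants to reproduce a quick local check like this, the sketch below simply queries phi3 through a local Ollama server's REST API (default port 11434). It is a sanity check outside of GPT Researcher itself, and assumes `ollama pull phi3` has already been run.

```python
import requests

# Minimal sanity check against a local Ollama server (default port 11434).
# This is not GPT Researcher's own integration, just a quick way to confirm
# that phi3 responds locally before wiring it into a larger tool.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3",
        "prompt": "Summarize the benefits of running an LLM locally in two sentences.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```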
-
Excited for yet another HUGE release that includes LLM support for the following: Llama3, Mistral, Anthropic, HuggingFace, Together AI, Gemini and more!
We've also updated our docs, including examples.
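As a rough illustration of switching between the newly supported providers, here is a minimal sketch of selecting a provider and model via environment variables before running a research task. The variable names (`LLM_PROVIDER`, `FAST_LLM_MODEL`, `SMART_LLM_MODEL`) and the `GPTResearcher` entry point are assumptions based on the project's docs; check the updated documentation for the exact configuration keys.

```python
import asyncio
import os

# Hypothetical configuration keys -- the exact environment variable names
# may differ; see the updated docs for the supported providers and settings.
os.environ["LLM_PROVIDER"] = "google"         # e.g. "openai", "anthropic", "ollama", ...
os.environ["FAST_LLM_MODEL"] = "gemini-pro"   # model assumed for lightweight calls
os.environ["SMART_LLM_MODEL"] = "gemini-pro"  # model assumed for report writing

from gpt_researcher import GPTResearcher  # assumed entry point from the docs

async def main() -> None:
    researcher = GPTResearcher(
        query="What is retrieval-augmented generation?",
        report_type="research_report",
    )
    await researcher.conduct_research()   # gather and curate sources
    report = await researcher.write_report()
    print(report)

asyncio.run(main())
```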
This release also includes LangGraph deployment by @hwchase17 and additional stability improvements. Thank you for all the amazing contributions!
What's Changed
New Contributors
Full Changelog: v0.2.3...v0.2.4
This discussion was created from the release LLMs support 🤖.