Integrating Internet access for local LLMs #53
Closed
chaoticsoap started this conversation in Ideas
Replies: 1 comment 2 replies
-
You can already achieve online capability by using, e.g., Perplexity's online models: https://docs.perplexity.ai/docs/model-cards (this obviously isn't local, though). I feel like online capability should exist at the LLM level, like the above, not at my app's level. Maybe I'll change my mind on that someday. If it were in my app, why is llm-axe the best option? I'm sure there are a million other options, which is another reason I hesitate :)
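For reference, calling one of those online models is an ordinary chat-completion request, since Perplexity exposes an OpenAI-compatible endpoint. A minimal sketch (the model name below is illustrative, not confirmed; check the model cards link for current names):

```python
# Sketch: querying a Perplexity "online" model through its
# OpenAI-compatible API. The model name is illustrative; see the
# model cards page linked above for the current list.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",
    base_url="https://api.perplexity.ai",
)

response = client.chat.completions.create(
    model="llama-3-sonar-small-32k-online",  # assumed model name
    messages=[{"role": "user", "content": "What happened in tech news today?"}],
)
print(response.choices[0].message.content)
```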
-
Seems quite possible using llm-axe. Please include this!
Simple script example:
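A minimal sketch of what that could look like, assuming llm-axe's OnlineAgent wrapping a local Ollama model (class and method names as shown in llm-axe's README; untested here):

```python
# Sketch, assuming llm-axe's OnlineAgent API: the agent searches the
# internet for the prompt, scrapes relevant pages, and passes that
# context to a local Ollama model so it can answer with current info.
from llm_axe import OllamaChat, OnlineAgent

llm = OllamaChat(model="llama3:instruct")  # any locally pulled Ollama model
searcher = OnlineAgent(llm)

response = searcher.search("What is the latest version of the Linux kernel?")
print(response)
```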