# OpenAI Gradio UI with Streamed Responses

A simple Gradio UI that runs locally, so you can interact with your chosen LLM and see its responses streamed as they are generated. It can be used with Ollama, OpenAI, and other private or public LLM providers.

This is a work in progress; check back regularly for updates.

**Usage:**

1. First, run Ollama locally with your chosen model (mistral, codellama, etc.): `ollama run <llm>`
2. Then run the main Python script: `python main8v2.py`

Requirements: `gradio`, `requests` (`json` is part of the Python standard library).