OPENAI GRADIO UI WITH STREAMING RESPONSES
Can be used with Ollama, OpenAI, and other private and public LLM providers.
A simple Gradio UI that runs locally, so you can interact with your chosen LLM.
This is a work in progress - check back regularly for updates.
** First, run Ollama locally with your chosen model (mistral, codellama, etc.): ollama run <model>
** Then run the main Python script: python main8v2.py
Requirements: gradio, requests (json is part of the Python standard library)