OPENAI GRADIO UI WITH STREAMING RESPONSES

Can be used with Ollama, OpenAI, and other private and public LLM providers.

A simple Gradio UI that runs locally, so you can interact with your chosen LLM.

This is a work in progress; check back regularly for updates.

** First, run Ollama locally with your LLM of choice (mistral, codellama, etc.): ollama run <model>, e.g. ollama run mistral

** Then run the main Python script: python main8v2.py

Requirements: gradio, json, requests
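
The core idea is a Gradio chat function that streams partial text from an OpenAI-compatible completions endpoint. The sketch below is a minimal illustration of that approach using the listed requirements (gradio, json, requests); it is not the repository's main8v2.py, and the endpoint URL, model name, and chat history format are assumptions for a local Ollama setup.

```python
# Minimal sketch of a local Gradio chat UI that streams tokens from an
# OpenAI-compatible endpoint. URL and model below are assumptions for Ollama.
import json
import requests
import gradio as gr

API_URL = "http://localhost:11434/v1/chat/completions"  # assumed local Ollama endpoint
MODEL = "mistral"                                        # assumed model, e.g. pulled via `ollama run mistral`


def chat(message, history):
    # Rebuild the conversation in OpenAI chat format.
    # Assumes Gradio's default (user, assistant) pair history format.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})

    # Request a streamed completion and yield the growing reply as chunks arrive.
    with requests.post(
        API_URL,
        json={"model": MODEL, "messages": messages, "stream": True},
        stream=True,
        timeout=300,
    ) as resp:
        resp.raise_for_status()
        partial = ""
        for line in resp.iter_lines():
            if not line:
                continue
            data = line.decode("utf-8").removeprefix("data: ")
            if data.strip() == "[DONE]":
                break
            chunk = json.loads(data)
            delta = chunk["choices"][0]["delta"].get("content", "")
            partial += delta
            yield partial  # Gradio renders each yield as the updated assistant reply


gr.ChatInterface(chat, title="Local LLM chat (streaming)").launch()
```

Adjust API_URL and MODEL to match your provider; OpenAI's hosted API additionally requires an Authorization header with your API key.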
