inferit is a visual take on LLM inference. Most inference frontends are limited to a single visual input/output "thread", which makes it hard to compare output across different models, prompts, and sampler settings. inferit solves this with a UI that allows an unlimited number of side-by-side generations, making it a natural fit for exactly that kind of comparison and experimentation.
Some example use cases:
- model exploration and comparison
- prompt engineering
- sampler setting optimizations
Supported Backends:
- Any local or remote backend that is compatible with the OpenAI API
- Chrome's built-in LLM (Gemini Nano)
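For the first backend type, "OpenAI-compatible" just means the backend accepts a POST to `/v1/chat/completions` with the standard Chat Completions payload. A minimal TypeScript sketch of that exchange (the helper names and default temperature here are illustrative, not inferit's actual code):

```typescript
// Sketch of the request an OpenAI-compatible backend expects.
// baseUrl, apiKey and model are placeholders -- substitute your own.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[],
  temperature = 0.7, // sampler settings like this are what inferit lets you vary side by side
) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages, temperature }),
  };
}

async function complete(baseUrl: string, apiKey: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(
    `${baseUrl}/v1/chat/completions`,
    buildChatRequest(apiKey, model, [{ role: "user", content: prompt }]),
  );
  const data = await res.json();
  // Standard OpenAI response shape: first choice's message content.
  return data.choices[0].message.content;
}
```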
- Online: I've deployed an instance for instant access at https://inferit.index.garden
- The same code also ships as a browser extension, so it runs locally and offline without having to start a process: https://chromewebstore.google.com/detail/inferit/celkhcifjknihgjlieolcmchofdloagk
- Run it locally from this repo:
```sh
pnpm install
pnpm build
pnpm preview
```
Or start a development server with `pnpm dev`.

Once running, set your API credentials in the settings (stored on your device via `window.localStorage`) and you are good to go.
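Because credentials live in `window.localStorage`, they never leave your device. A sketch of how that kind of client-side persistence works (the `inferit-settings` key and `Settings` shape are hypothetical, not inferit's actual schema; an in-memory fallback keeps the sketch runnable outside the browser):

```typescript
// Hypothetical settings shape -- not inferit's actual schema.
type Settings = { baseUrl: string; apiKey: string };

// Use localStorage in the browser; fall back to an in-memory map elsewhere.
const memory = new Map<string, string>();
const store =
  typeof localStorage !== "undefined"
    ? localStorage
    : {
        getItem: (k: string) => memory.get(k) ?? null,
        setItem: (k: string, v: string) => { memory.set(k, v); },
      };

function saveSettings(s: Settings): void {
  // Credentials stay on-device; nothing is sent to a server.
  store.setItem("inferit-settings", JSON.stringify(s));
}

function loadSettings(): Settings | null {
  const raw = store.getItem("inferit-settings");
  return raw ? (JSON.parse(raw) as Settings) : null;
}
```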
Contributions of any kind are warmly welcomed!
<3