PromptPanel
Accelerating your AI agent adoption
Documentation | DockerHub | GitHub

Installation

Via Docker Run

To get started running your first PromptPanel instance:

docker run --name promptpanel -p 4000:4000 -v PROMPT_DB:/app/database -v PROMPT_MEDIA:/app/media --pull=always promptpanel/promptpanel:latest

After running, your environment will be available at: http://localhost:4000

Read more on running PromptPanel.
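As a quick sanity check that the container came up (a hedged sketch — any HTTP client works, and the URL assumes the default port mapping above):

```python
import urllib.request
import urllib.error

def panel_is_up(url="http://localhost:4000", timeout=5):
    """Return True if a PromptPanel instance answers HTTP at `url`."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any response short of a server error means the panel is serving.
            return 200 <= resp.status < 500
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False right after `docker run`, give the container a few seconds to finish starting before retrying.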

Via Docker Compose + Local / Offline Inference

To run PromptPanel alongside Ollama for local / offline inference, launch the following Docker Compose file:

curl -sSL https://promptpanel.com/content/media/manifest/docker-compose.yml | docker compose -f - up

This is equivalent to running:

services:
  promptpanel:
    image: promptpanel/promptpanel:latest
    container_name: promptpanel
    restart: always
    volumes:
      - PROMPT_DB:/app/database
      - PROMPT_MEDIA:/app/media
    ports:
      - 4000:4000
    environment:
      PROMPT_OLLAMA_HOST: http://ollama:11434
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: always
volumes:
  PROMPT_DB:
  PROMPT_MEDIA:
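The PROMPT_OLLAMA_HOST variable points PromptPanel at the Ollama container over the Compose network. As a rough sketch of what a request against that host looks like (Ollama exposes a /api/generate endpoint; the model name llama3 here is an arbitrary example, not something the stack installs for you):

```python
import json
from urllib.parse import urljoin

def build_generate_request(host, model, prompt):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    url = urljoin(host.rstrip("/") + "/", "api/generate")
    # stream=False asks Ollama for a single JSON response instead of a stream.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return url, body

url, body = build_generate_request("http://ollama:11434", "llama3", "Hello")
```

Remember that models still need to be pulled into the Ollama container before they can answer requests.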

Your models, conversations, and logic are locked in walled gardens.

Let's free your AI interface.

  • Run any large language model, across any inference provider, any way you want. From commercial models like OpenAI, Anthropic, Gemini, or Cohere - to open source models, either hosted or running locally via Ollama.
  • Access controls to assign users to agents without revealing your API tokens or credentials. Enable user sign-up and login with OpenID Connect (OIDC) single sign-on.
  • Bring your own data and store it locally on your instance. Use it safely by pairing it with any language model, whether online or offline.
  • Create custom agent plugins in Python to extend your AI agent capabilities and build retrieval-augmented generation (RAG) pipelines.
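To make the RAG idea above concrete, here is an illustrative sketch of the retrieval step — plain keyword-overlap ranking, not PromptPanel's actual plugin API, and the documents are made-up examples:

```python
def retrieve(query, documents, k=2):
    """Naive retrieval: rank documents by word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

docs = [
    "PromptPanel runs agents against any model.",
    "Ollama serves models locally for offline use.",
    "Bananas are yellow.",
]
top = retrieve("run models locally offline", docs, k=1)
```

A real pipeline would replace the overlap score with embeddings and a vector index, then feed the retrieved passages into the model's prompt.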

Build your own agent plugins

Get started with a one-click cloud development environment on Gitpod:

Open in Gitpod

The ./plugins directory contains the community plugin agents that ship with PromptPanel, as well as a sample agent you can use as a template for your own development.

  • The ./hello_agent directory gives you some scaffolding for a sample agent.
  • The other community plugin agents give you references to sample from.
  • The docker-compose-agent-dev.yml file gives you a sample with the various mounts and environment variables we recommend for development.

For more information on building your first plugin, we recommend reading the PromptPanel documentation.

Running DEV_PORT=4000 docker compose -f docker-compose-agent-dev.yml up from this directory, with a development port set, will bring up a development environment you can use to start developing your agent plugin.

Command:

DEV_PORT=4000 docker compose -f docker-compose-agent-dev.yml up

With these settings, your development environment will be available at: http://localhost:4000
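For context on how DEV_PORT reaches the container, Compose files commonly interpolate shell variables into port mappings. A hedged sketch of what that mapping might look like inside docker-compose-agent-dev.yml (an assumption for illustration — the actual file may differ):

```yaml
services:
  promptpanel-dev:
    image: promptpanel/promptpanel:latest
    ports:
      - "${DEV_PORT:-4000}:4000"   # fall back to 4000 if DEV_PORT is unset
```

The `${VAR:-default}` syntax is standard Compose interpolation, so you can omit DEV_PORT entirely and still get a working default.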

Questions?

Feel free to get in contact with us at:
[email protected]


App Screenshot

Development Experience via GitPod
