LangGraph-GUI is a user-friendly interface for managing and visualizing Node-Edge workflows with LangGraph. It supports creating, editing, and running workflows locally using language models served by Ollama.
This repo is implemented with Qt. If you want the ReactFlow version of the frontend, see LangGraph-GUI-ReactFlow.
This is a node-edge based GUI that exports the graph to JSON on save, then runs that graph with LangGraph.
If you want to learn more about LangGraph, we have a LangGraph-for-dummies guide: LangGraph-learn
A graph (JSON) has exactly one start node, which maps to LangGraph's START.
A Step node maps to LangGraph's add_node. You can drag an edge from a left node's right port to a right node's left port.
If you drag edges between two nodes in both directions, you create a cycle.
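To make the saved-graph idea concrete, here is a minimal sketch of what such a JSON file could look like and how the "exactly one start node" rule can be checked. The field names (uniq_id, type, next) are illustrative assumptions, not the exact schema LangGraph-GUI emits:

```python
import json

# Hypothetical saved graph: one START node and one STEP node.
# Field names here are an assumption for illustration only.
saved_graph = json.loads("""
{
  "nodes": [
    {"uniq_id": 1, "type": "START", "next": [2]},
    {"uniq_id": 2, "type": "STEP", "name": "summarize", "next": []}
  ]
}
""")

# A valid graph has exactly one start node, which maps to LangGraph's START.
start_nodes = [n for n in saved_graph["nodes"] if n["type"] == "START"]
assert len(start_nodes) == 1
print(start_nodes[0]["next"])  # ids of nodes reachable from START
```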
Filling the tool field of a Step node makes that step call a function; the tool itself must be defined in a Tool node. A Tool node contains real Python function code and needs the @tool decorator, as in LangChain Custom Tools.
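The shape of such a tool function is sketched below. The real GUI expects LangChain's decorator (from langchain_core.tools import tool); the stand-in decorator here is an assumption so the example runs without LangChain installed:

```python
# Stand-in for LangChain's @tool decorator, used here only so the
# sketch is self-contained; in the GUI you would use the real one.
def tool(fn):
    fn.is_tool = True
    return fn

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

print(multiply(6, 7))  # 42
```

The docstring matters: LangChain uses it as the tool's description when the model decides which tool to call.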
A CONDITION node maps to a LangGraph conditional edge (add_conditional_edges):
- the green edge is the true-case path
- the red edge is the false-case path
To install the required dependencies for the front-end GUI, run:
pip install PySide6
To install the required dependencies for LangGraph, run:
pip install langchain langchain-community langchain-core langgraph
To start the front-end GUI, execute:
python frontend.py
This will allow you to read and write the JSON files that represent your workflows.
If you want to run a local LLM, start Ollama first:
ollama serve
Then, to run the back-end locally with a model such as gemma2, use:
python backend.py --graph example.json --llm gemma2 --tee output.log
This command parses the specified JSON file into a graph and runs it.
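Conceptually, the back-end loads the saved JSON, finds the start node, and follows the edges step by step. The sketch below is an assumption about that flow (field names and the run loop are illustrative, not the actual backend code):

```python
import json

# Hypothetical saved graph with a linear chain of two steps.
graph = json.loads("""
{"nodes": [
  {"uniq_id": 1, "type": "START", "next": [2]},
  {"uniq_id": 2, "type": "STEP", "name": "draft", "next": [3]},
  {"uniq_id": 3, "type": "STEP", "name": "review", "next": []}
]}
""")

by_id = {n["uniq_id"]: n for n in graph["nodes"]}
node = next(n for n in graph["nodes"] if n["type"] == "START")

visited = []
while node["next"]:                 # stop at a node with no outgoing edge
    node = by_id[node["next"][0]]   # follow the first outgoing edge
    visited.append(node["name"])

print(visited)  # ['draft', 'review']
```

In the real backend, each visited Step node would invoke the LLM (and any tools) instead of just recording its name.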
To build the front-end GUI into a standalone executable, follow these steps:
- Install PyInstaller:

  pip install pyinstaller

- Navigate to the source directory:

  cd src

- Run PyInstaller with the necessary hooks:

  pyinstaller --onefile --additional-hooks-dir=. frontend.py